What is Post Keynesian Economics?

30 May, 2017 at 12:41 | Posted in Economics | 6 Comments

John Maynard Keynes’s 1936 book The General Theory of Employment, Interest, and Money attempted to overthrow classical theory and revolutionize how economists think about the economy. Economists who build upon Keynes’s General Theory to analyze the economic problems of the twenty-first-century global economy are called Post Keynesians. Keynes’s “principle of effective demand” (1936, chap. 2) declared that the axioms underlying classical theory were not applicable to a money-using, entrepreneurial economic system. Consequently, the mainstream theory’s “teaching is misleading and disastrous if we attempt to apply it to the facts of experience” (Keynes 1936, p. 3). To develop an economic theory applicable to a monetary economy, Keynes suggested rejecting three basic axioms of classical economics (1936, p. 16).

Unfortunately, the axioms that Keynes suggested for rejection are still part of the foundation of twenty-first-century mainstream economic theory. Post Keynesians have thrown out the three axioms that Keynes suggested rejecting in The General Theory. The rejected axioms are the ergodic axiom, the gross-substitution axiom, and the neutral-money axiom … Only if these axioms are rejected can a model be developed that has the following characteristics:

•Money matters in the long and short run, that is, changes in the money supply can affect decisions that determine the level of employment and real economic output.

•As the economic system moves from an irrevocable past to an uncertain future, decision makers recognize that they make important, costly decisions in uncertain conditions where reliable, rational calculations regarding the future are impossible.

•People and organizations enter into monetary contracts. These money contracts are a human institution developed to efficiently organize time-consuming production and exchange processes. The money-wage contract is the most ubiquitous of these contracts.

•Unemployment, rather than full employment, is a common laissez-faire situation in a market-oriented, monetary production economy.

•The ergodic axiom postulates that all future events are actuarially certain, that is, that the future can be accurately forecasted from an analysis of existing market data. Consequently, this axiom implies that income earned at any employment level is entirely spent either on produced goods for today’s consumption or on buying investment goods that will be used to produce goods for the (known) future consumption of today’s savers. In other words, orthodox theory assumes that all income is always immediately spent on producibles, so there is never a lack of effective demand for things that industry can produce at full employment … Post Keynesian theory rejects the ergodic axiom.

In Post Keynesian theory … people recognize that the future is uncertain (nonergodic) and cannot be reliably predicted.

Paul Davidson
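The nonergodicity Davidson describes can be made concrete with a toy simulation. The sketch below is my own illustration, not from the text: a multiplicative process whose parameters (1.5 and 0.6) are chosen purely for exposition. Its ensemble average grows, yet almost every individual trajectory decays, so no amount of cross-sectional market data lets a single decision maker forecast her own path.

```python
import random

random.seed(1)

# A multiplicative process: each period wealth is multiplied by 1.5 or 0.6
# with equal probability. The ensemble (cross-sectional) expectation grows,
# since the expected factor per period is 0.5 * 1.5 + 0.5 * 0.6 = 1.05.
# Yet almost every individual trajectory decays, because the time-average
# growth factor is sqrt(1.5 * 0.6) ~ 0.949 < 1. Time averages and ensemble
# averages part company: the process is nonergodic.

def one_path(periods):
    wealth = 1.0
    for _ in range(periods):
        wealth *= 1.5 if random.random() < 0.5 else 0.6
    return wealth

periods = 100
exact_ensemble_mean = 1.05 ** periods                 # ~131.5, computed exactly
median_path = sorted(one_path(periods) for _ in range(1001))[500]

print(f"ensemble mean (exact): {exact_ensemble_mean:.1f}")
print(f"median single path:    {median_path:.6f}")    # close to zero
```

Averaging over the ensemble tells you next to nothing about what a typical history looks like, which is exactly why the ergodic axiom is not an innocent technical assumption.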

The financial crisis of 2007-08 took most laymen and economists by surprise. What was it that went wrong with our macroeconomic models, since they obviously neither foresaw the collapse nor even rendered it conceivable?

There are many who have ventured to answer that question. And they have come up with a variety of answers, ranging from the exaggerated mathematization of economics, to irrational and corrupt politicians.

But the root of our problem goes much deeper. It ultimately goes back to how we look upon the data we are handling. In ‘modern’ macroeconomics — Dynamic Stochastic General Equilibrium, New Synthesis, New Classical and New ‘Keynesian’ — variables are treated as if drawn from a known “data-generating process” that unfolds over time and from which we therefore have access to heaps of historical time-series observations. If we do not assume that we know the ‘data-generating process’ – if we do not have the ‘true’ model – the whole edifice collapses. And of course it has to. I mean, who honestly believes that we should have access to this mythical Holy Grail, the data-generating process?

‘Modern’ macroeconomics obviously did not anticipate the enormity of the problems that unregulated ‘efficient’ financial markets created. Why? Because it builds on the myth that we know the ‘data-generating process’ and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

This is like saying that you are going on a holiday trip and know that the chance of the weather being sunny is at least 30%, and that this is enough for you to decide whether or not to bring your sunglasses. You are supposed to be able to calculate the expected utility based on the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.

But as Keynes convincingly argued in his monumental Treatise on Probability (1921), this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10% and according to another – equally good – model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to.
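The sunglasses example can be put in code. The payoff numbers below (a benefit of 4 from having sunglasses on a sunny day, a carrying cost of 1) are invented purely for illustration; the point is structural. Under risk, a single known probability yields a determinate expected-utility choice. Under Keynesian uncertainty, two equally good models give different probabilities, and the expected-utility calculus no longer singles out one decision:

```python
# Under risk a single known probability of sun gives a determinate choice.
# Under uncertainty, model A says 10% and model B says 40%, with no way
# to choose between them, so the decision is indeterminate.
# Payoff numbers are made up purely for illustration.

BENEFIT_IF_SUNNY = 4.0   # utility of having sunglasses on a sunny day
CARRY_COST = 1.0         # nuisance of packing them

def net_gain_from_bringing(p_sun):
    """Expected utility of bringing sunglasses minus not bringing them."""
    return p_sun * BENEFIT_IF_SUNNY - CARRY_COST

# Risk: one known probability -> a unique 'rational' decision.
assert net_gain_from_bringing(0.30) > 0   # bring them

# Uncertainty: two equally good models, neither privileged.
gain_a = net_gain_from_bringing(0.10)     # negative: leave them at home
gain_b = net_gain_from_bringing(0.40)     # positive: bring them
print(gain_a, gain_b)                     # the sign flips across the models
```

No means, no variances, no given distribution to appeal to: the two models bracket the decision threshold, and the calculus simply cannot adjudicate.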

In the end this is what it all boils down to. We all know that many activities, relations, processes and events are of the Keynesian uncertainty-type. The data do not unequivocally single out one decision as the only ‘rational’ one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

Some macroeconomists, however, still want to be able to use their hammer. So they pretend that the world looks like a nail, and that uncertainty can be reduced to risk. They then construct their mathematical models on that assumption. The result: financial crises and economic havoc.

How much better it would be – how much greater the chance that we would not lull ourselves into the comforting thought that we know everything, that everything is measurable, and that we have everything under control – if we could instead simply admit that we often do not know, and that we have to live with that uncertainty as best we can.

Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheel is a sure recipe for only one thing — economic disaster.

Speech by the British Prime Minister to the American President

29 May, 2017 at 17:49 | Posted in Varia | Comments Off on Speech by the British Prime Minister to the American President

 

Looking forward to hearing Theresa May deliver something similar to Donald Trump …

Chicago economics — a dangerous pseudo-scientific zombie

29 May, 2017 at 14:58 | Posted in Economics | 5 Comments

Every dollar of increased government spending must correspond to one less dollar of private spending. Jobs created by stimulus spending are offset by jobs lost from the decline in private spending. We can build roads instead of factories, but fiscal stimulus can’t help us to build more of both. This form of “crowding out” is just accounting, and doesn’t rest on any perceptions or behavioral assumptions.

John Cochrane

What Cochrane is reiterating here is nothing but Say’s law, basically saying that savings equal investments, and that if the state increases investments, private investments have to come down (‘crowding out’). As an accounting identity there is of course nothing wrong with the law, but as such it is also totally uninteresting from an economic point of view. As some of my Swedish forerunners — Gunnar Myrdal and Erik Lindahl — stressed more than 80 years ago, it is really a question of ex ante and ex post adjustments. And as a famous English economist further stressed around the same time, what happens when ex ante savings and investments differ is that we basically get output adjustments. GDP changes, and that is what makes savings and investments equal ex post. And this, nota bene, says nothing at all about the success or failure of fiscal policies!
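The ex ante/ex post point can be sketched in a few lines. This is a deliberately stylized textbook-multiplier model with illustrative parameter values of my own (a marginal propensity to consume of 0.8, planned investment of 100), not an empirical claim: when planned saving and planned investment differ, it is income that moves, and realized saving ends up equal to investment without any private spending having been ‘crowded out’ by accounting fiat.

```python
# Stylized output-adjustment sketch (all parameter values illustrative).
# Households plan to consume a fraction c of income; firms plan investment I.
# If planned (ex ante) saving differs from planned investment, output
# adjusts until realized (ex post) saving equals investment.

c = 0.8          # marginal propensity to consume
I = 100.0        # planned investment

Y = 300.0        # start income away from equilibrium
for _ in range(200):         # simple output-adjustment iteration
    demand = c * Y + I       # planned expenditure at current income
    Y = demand               # output moves toward demand

saving_ex_post = (1 - c) * Y
print(f"Y = {Y:.1f}, S = {saving_ex_post:.1f}, I = {I:.1f}")
# Y converges to I / (1 - c) = 500; saving equals investment ex post
# because income changed, not because anything was 'crowded out'.
```

The accounting identity S = I holds at every equilibrium, but it is the level of GDP that does the equilibrating, which is precisely why the identity says nothing about fiscal policy.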

Government borrowing is supposed to “crowd out” private investment.

The current reality is that on the contrary, the expenditure of the borrowed funds (unlike the expenditure of tax revenues) will generate added disposable income, enhance the demand for the products of private industry, and make private investment more profitable. As long as there are plenty of idle resources lying around, and monetary authorities behave sensibly (instead of trying to counter the supposedly inflationary effect of the deficit), those with a prospect for profitable investment can be enabled to obtain financing. Under these circumstances, each additional dollar of deficit will in the medium long run induce two or more additional dollars of private investment. The capital created is an increment to someone’s wealth and ipso facto someone’s saving. “Supply creates its own demand” fails as soon as some of the income generated by the supply is saved, but investment does create its own saving, and more. Any crowding out that may occur is the result, not of underlying economic reality, but of inappropriate restrictive reactions on the part of a monetary authority in response to the deficit.

William Vickrey Fifteen Fatal Fallacies of Financial Fundamentalism

A couple of years ago, in a lecture on the US recession, Robert Lucas gave an outline of what the new classical school of macroeconomics today thinks about the latest downturns in the US economy and its future prospects.

Lucas starts by showing that real US GDP has grown at an average yearly rate of 3 per cent since 1870, with one big dip during the Depression of the 1930s and a big – but smaller – dip in the recent recession.

After stating his view that the US recession that started in 2008 was basically caused by a run for liquidity, Lucas then goes on to discuss the prospect of recovery from where the US economy is today, maintaining that past experience would suggest an “automatic” recovery, if the free market system is left to repair itself to equilibrium unimpeded by social welfare activities of the government.

As could be expected, there is no room for any Keynesian-type considerations of possible shortages of aggregate demand discouraging the recovery of the economy. No, as usual in the new classical macroeconomic school’s explanations and prescriptions, the blame game points to the government and its lack of supply-side policies.

Lucas is convinced that what might arrest the recovery are higher taxes on the rich, greater government involvement in the medical sector and tougher regulations of the financial sector. But – if left to run its course unimpeded by European-type welfare state activities – the free market will fix it all.

In a rather cavalier manner – without a hint of argument or presentation of empirical facts – Lucas dismisses even the possibility of a shortfall of demand. For someone who proclaimed Keynesianism dead already 30 years ago — “people don’t take Keynesian theorizing seriously anymore; the audience starts to whisper and giggle to one another” – this is of course only what could be expected. Demand considerations are simply ruled out on whimsical theoretical-ideological grounds, much as we have seen other neo-liberal economists do over and over again in their attempts to explain away the fact that the latest economic crisis shows how markets have failed to deliver. If there is a problem with the economy, the true cause has to be government.

Chicago economics is a dangerous pseudo-scientific zombie ideology that ultimately relies on the poor having to pay for the mistakes of the rich. Trying to explain business cycles in terms of rational expectations has failed blatantly. Maybe it would be asking too much of freshwater economists like Lucas and Cochrane to concede that, but it’s still a fact that ought to be embarrassing. My rational expectation is that 30 years from now, no one will know who Robert Lucas or John Cochrane was. John Maynard Keynes, on the other hand, will still be known as one of the masters of economics.

If at some time my skeleton should come to be used by a teacher of osteology to illustrate his lectures, will his students seek to infer my capacities for thinking, feeling, and deciding from a study of my bones? If they do, and any report of their proceedings should reach the Elysian Fields, I shall be much distressed, for they will be using a model which entirely ignores the greater number of relevant variables, and all of the important ones. Yet this is what ‘rational expectations’ does to economics.

G. L. S. Shackle

White Flag

28 May, 2017 at 20:05 | Posted in Varia | Comments Off on White Flag

 

Economic modeling — a realist perspective

28 May, 2017 at 13:37 | Posted in Theory of Science & Methodology | Comments Off on Economic modeling — a realist perspective

To his credit Keynes was not, in contrast to Samuelson, a formalist committed to mathematical economics. Keynes wanted models, but for him, building them required ‘a vigilant observation of the actual working of our system.’ Indeed, ‘to convert a model into a quantitative formula is to destroy its usefulness as an instrument of thought.’ That conclusion can be strongly endorsed!

Modern economics has become increasingly irrelevant to the understanding of the real world. The main reason for this irrelevance is the failure of economists to match their deductive-axiomatic methods with their subject.

In mainstream neoclassical economics internal validity is almost everything and external validity next to nothing. Why anyone should be interested in those kinds of theories and models is beyond yours truly’s imagination. As long as mainstream economists do not come up with export licenses for their theories and models to the real world in which we live, they really should not be surprised if people say that this is not science, but autism.

Studying mathematics and logic is interesting and fun. It sharpens the mind. In pure mathematics and logic we do not have to worry about external validity. But economics is not pure mathematics or logic. It is about society. The real world. Forgetting that, economics is in real danger of becoming — as John Maynard Keynes put it in a letter to Ragnar Frisch in 1935 — “nothing better than a contraption proceeding from premises which are not stated with precision to conclusions which have no clear application.”

The fundamental econometric dilemma

27 May, 2017 at 10:20 | Posted in Statistics & Econometrics | Comments Off on The fundamental econometric dilemma


Many thanks for sending me your article. I enjoyed it very much. I am sure these matters need discussing in that sort of way. There is one point, to which in practice I attach a great importance, you do not allude to. In many of these statistical researches, in order to get enough observations they have to be scattered over a lengthy period of time; and for a lengthy period of time it very seldom remains true that the environment is sufficiently stable. That is the dilemma of many of these enquiries, which they do not seem to me to face. Either they are dependent on too few observations, or they cannot rely on the stability of the environment. It is only rarely that this dilemma can be avoided.

Letter from J. M. Keynes to T. Koopmans, May 29, 1941

 

Econometric patterns should never be seen as anything other than possible clues to follow. Behind observable data there are real structures and mechanisms operating — things that are, if we really want to understand, explain and (possibly) predict events in the real world, more important to get hold of than simple correlations and regressions between observable variables.

Math cannot establish the truth value of a fact. Never has. Never will.

Paul Romer

Dido

26 May, 2017 at 21:22 | Posted in Varia | Comments Off on Dido

 

Just face it — austerity policies do not work!

26 May, 2017 at 10:29 | Posted in Economics | 1 Comment

If failing to understand some basic Keynesian relations is a part of the explanation of what happened, there was also another, and more subtle, story behind the confounded economics of austerity. There was an odd confusion in policy thinking between the real need for institutional reform in Europe and the imagined need for austerity – two quite different things …

An analogy can help to make the point clearer: it is as if a person had asked for an antibiotic for his fever, and been given a mixed tablet with antibiotic and rat poison. You cannot have the antibiotic without also having the rat poison. We were in effect being told that if you want economic reform then you must also have, along with it, economic austerity, although there is absolutely no reason whatsoever why the two must be put together as a chemical compound.

Amartya Sen

We are not going to get out of the present economic doldrums as long as we continue to be obsessed with the insane idea that austerity is the universal medicine. When an economy is already on the ropes, you can’t just cut government spending. Cutting government expenditure reduces aggregate demand. Lower aggregate demand means lower tax revenues. Lower tax revenues mean increased deficits — and calls for even more austerity. And so on, and so on.
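The self-defeating spiral described above can be put in back-of-the-envelope form. The parameter values in this sketch (a marginal propensity to consume of 0.8 and a tax rate of 0.4) are illustrative choices of my own, but the mechanism is standard multiplier arithmetic: a spending cut shrinks income, which shrinks tax revenue, so the deficit improves by far less than the headline cut while output takes the full hit.

```python
# Back-of-the-envelope austerity arithmetic (all parameters illustrative).
# Cutting spending by dG lowers income through the multiplier, which
# lowers tax revenue, so the deficit shrinks by much less than dG.

c = 0.8    # marginal propensity to consume
t = 0.4    # tax rate
dG = 10.0  # size of the spending cut

multiplier = 1.0 / (1.0 - c * (1.0 - t))   # ~1.92 with these numbers
dY = -multiplier * dG                       # income falls by ~19.2
dT = t * dY                                 # tax revenue falls by ~7.7
d_deficit = -dG - dT                        # deficit shrinks by only ~2.3

print(f"income change:  {dY:.1f}")
print(f"deficit change: {d_deficit:.1f}  (headline cut was {-dG:.1f})")
```

With these (made-up but plausible) numbers, a 10-unit cut buys roughly a 2-unit deficit improvement at the cost of a 19-unit fall in income, which is the spiral in miniature.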

Kid in suit

26 May, 2017 at 10:06 | Posted in Varia | 1 Comment

 

I used to laugh at my kids when they behaved like this in kindergarten.
But I guess most people expect something else from a president …

I can’t but grieve for a nation that has given us presidents like George Washington, Thomas Jefferson, Abraham Lincoln, and Franklin D. Roosevelt, and now is run by a witless clown. An absolute disgrace.

Expansionary austerity? You gotta be kidding!

25 May, 2017 at 18:19 | Posted in Economics | Comments Off on Expansionary austerity? You gotta be kidding!


[h/t Gabriel Uriarte]

Modern economics — pseudo-science based on FWUTV

25 May, 2017 at 14:39 | Posted in Statistics & Econometrics | Comments Off on Modern economics — pseudo-science based on FWUTV

The use of FWUTV — facts with unknown truth values — is, as Paul Romer noted in what was perhaps last year’s most interesting insider critique of mainstream economics, all too often resorted to in macroeconomic modelling. But other parts of ‘modern’ economics besides New Classical RBC economics have also succumbed to this questionable practice:

Statistical significance is not the same as real-world significance — all it offers is an indication of whether you’re seeing an effect where there is none. Even this narrow technical meaning, though, depends on where you set the threshold at which you are willing to discard the ‘null hypothesis’ — that is, in the above case, the possibility that there is no effect. I would argue that there’s no good reason to always set it at 5 percent. Rather, it should depend on what is being studied, and on the risks involved in acting — or failing to act — on the conclusions …

This example illustrates three lessons. First, researchers shouldn’t blindly follow convention in picking an appropriate p-value cutoff. Second, in order to choose the right p-value threshold, they need to know how the threshold affects the probability of a Type II error. Finally, they should consider, as best they can, the costs associated with the two kinds of errors.

Statistics is a powerful tool. But, like any powerful tool, it can’t be used the same way in all situations.

Narayana Kocherlakota
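Kocherlakota's trade-off between the p-value cutoff and the Type II error can be computed directly. The sketch below assumes a one-sided z-test and a true effect of 2 standard errors (both numbers illustrative, chosen by me for the example): tightening the significance threshold mechanically raises the probability of missing a real effect.

```python
from statistics import NormalDist

# How the p-value cutoff and the Type II error rate trade off, for a
# one-sided z-test with a true effect of 2 standard errors (illustrative).
Z = NormalDist()

def type_ii_error(alpha, effect_in_se=2.0):
    """P(fail to reject) when the true effect is effect_in_se std errors."""
    z_crit = Z.inv_cdf(1.0 - alpha)
    return Z.cdf(z_crit - effect_in_se)

for alpha in (0.20, 0.05, 0.01):
    print(f"alpha = {alpha:4.2f}  ->  Type II error = {type_ii_error(alpha):.2f}")
```

With this effect size the Type II error climbs from roughly 12% at a 20% cutoff to well over half at a 1% cutoff, which is why the right threshold depends on the relative costs of the two kinds of error rather than on convention.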

Good lessons indeed — underlining how important it is not to equate science with statistical calculation. All science entails human judgement, and using statistical models does not relieve us of that necessity. When working with misspecified models, the scientific value of significance testing is actually zero — even though you are making valid statistical inferences! Statistical models and concomitant significance tests are no substitute for doing science.

In its standard form, a significance test is not the kind of ‘severe test’ we are looking for when trying to confirm or disconfirm empirical scientific hypotheses. This is problematic for many reasons, one being the strong tendency to accept the null hypothesis whenever it cannot be rejected at the standard 5% significance level. In their standard form, significance tests bias against new hypotheses by making it hard to disconfirm the null hypothesis.

And as shown over and over again in applied work, people have a tendency to read ‘not disconfirmed’ as ‘probably confirmed.’ Standard scientific methodology tells us that when there is, say, only a 10% probability that pure sampling error could account for the observed difference between the data and the null hypothesis, it would be more ‘reasonable’ to conclude that we have a case of disconfirmation. Especially if we perform many independent tests of our hypothesis and they all give about the same 10% result as our reported one, most researchers would count the hypothesis as even more disconfirmed.

We should never forget that the underlying parameters we use when performing significance tests are model constructions. Our p-values mean next to nothing if the model is wrong. And most importantly — statistical significance tests DO NOT validate models!

In journal articles a typical regression equation will have an intercept and several explanatory variables. The regression output will usually include an F-test, with p – 1 degrees of freedom in the numerator and n – p in the denominator. The null hypothesis will not be stated. The missing null hypothesis is that all the coefficients vanish, except the intercept.

If F is significant, that is often thought to validate the model. Mistake. The F-test takes the model as given. Significance only means this: if the model is right and the coefficients are 0, it is very unlikely to get such a big F-statistic. Logically, there are three possibilities on the table:
i) An unlikely event occurred.
ii) Or the model is right and some of the coefficients differ from 0.
iii) Or the model is wrong.

Yes, indeed. Forgetting — or at least pretending to forget — that third possibility, turns much of ‘modern’ economics and econometrics into post-real blah blah blah pseudo-science.
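The third possibility is easy to exhibit by simulation. A classic demonstration (Granger and Newbold's spurious-regression setup, sketched here with my own sample sizes and an approximate critical value) regresses one random walk on another, completely independent, random walk: the model is simply wrong, yet the F-test comes out 'significant' far more often than the nominal 5 percent.

```python
import random

random.seed(0)

# Spurious regression: regress one random walk on another, independent,
# random walk. No relation exists, so the model is wrong, yet the F-test
# rejects the null far more often than the nominal 5%.

def random_walk(n):
    x, path = 0.0, []
    for _ in range(n):
        x += random.gauss(0.0, 1.0)
        path.append(x)
    return path

def f_statistic(x, y):
    """F-test of the slope in a one-regressor OLS fit, computed by hand."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    beta = sxy / sxx
    fitted = [my + beta * (xi - mx) for xi in x]
    ess = sum((fi - my) ** 2 for fi in fitted)
    rss = sum((yi - fi) ** 2 for yi, fi in zip(y, fitted))
    return ess / (rss / (n - 2))

n, sims = 200, 500
F_CRIT = 3.89                  # approx. 5% critical value of F(1, 198)
rejections = sum(
    f_statistic(random_walk(n), random_walk(n)) > F_CRIT for _ in range(sims)
)
print(f"'significant' in {rejections / sims:.0%} of regressions")
```

A 'significant' F here validates nothing: the test takes the model as given, and the model is false in every one of these regressions.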

Financial crises — no big deal

23 May, 2017 at 17:51 | Posted in Economics | 1 Comment

Many say or think that there were problems in the financial system that gave rise to the Great Depression. We’ve looked at that in a systematic way using modern theory. And we found that businesses had all kinds of money to invest, and they didn’t. They increased distributions to owners. Why? The answer is that businesses did not perceive they had profitable investment opportunities.

I don’t think financial crises are a big deal.

Edward Prescott

And this blah blah blah guy got a “Nobel prize” …

‘Modern’ economics — blah blah blah

23 May, 2017 at 16:37 | Posted in Statistics & Econometrics | 2 Comments

A key part of the solution to the identification problem that Lucas and Sargent (1979) seemed to offer was that mathematical deduction could pin down some parameters in a simultaneous system. But solving the identification problem means feeding facts with truth values that can be assessed, yet math cannot establish the truth value of a fact. Never has. Never will.

In practice, what math does is let macro-economists locate the FWUTVs [facts with unknown truth values] farther away from the discussion of identification … Relying on a micro-foundation lets an author say, “Assume A, assume B, … blah blah blah …. And so we have proven that P is true. Then the model is identified.” …

Distributional assumptions about error terms are a good place to bury things because hardly anyone pays attention to them. Moreover, if a critic does see that this is the identifying assumption, how can she win an argument about the true expected value of the level of aether? If the author can make up an imaginary variable, “because I say so” seems like a pretty convincing answer to any question about its properties.

Paul Romer

Yes, indeed, modern mainstream economics — and especially its mathematical-statistical operationalization in the form of econometrics — fails miserably over and over again. One reason it does is that the error term in the regression models used is thought of as representing the effect of the variables omitted from the models. The error term is somehow thought to be a ‘cover-all’ term representing omitted content in the model, necessary to include to ‘save’ the assumed deterministic relation between the other random variables included in the model. Error terms are usually assumed to be orthogonal to (uncorrelated with) the explanatory variables. But since they are unobservable, they are also impossible to test empirically. And without justification of the orthogonality assumption, there is as a rule nothing to ensure identifiability.

In mainstream econometrics the error term is usually portrayed as representing the combined effect of the variables that are omitted from the model. What one does not say — in a way bordering on intellectual dishonesty — is that this assumption only works when (1) the combined effect is independent of each and every variable included in the model, and (2) the expectational value of the combined effect equals zero. And that is something almost never fulfilled in real world settings!
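What goes wrong when condition (1) fails can be shown in a few lines. In this sketch (all numbers are my own illustrative choices) the true model is y = x + z, but z is correlated with x and gets omitted, so it lands in the error term and breaks orthogonality. OLS then delivers a precisely estimated, badly biased coefficient:

```python
import random

random.seed(7)

# Omitted-variable bias in miniature. True model: y = 1.0*x + 1.0*z.
# We omit z, which is correlated with x, so z ends up in the error term
# and the error term is no longer orthogonal to x. (Numbers illustrative.)

n = 10_000
x = [random.gauss(0, 1) for _ in range(n)]
z = [0.8 * xi + random.gauss(0, 1) for xi in x]          # z correlated with x
y = [xi + zi + random.gauss(0, 1) for xi, zi in zip(x, z)]

# OLS slope of y on x alone, computed by hand
mx = sum(x) / n
my = sum(y) / n
num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
den = sum((xi - mx) ** 2 for xi in x)
beta = num / den

print(f"true effect of x: 1.0, estimated: {beta:.2f}")   # ~1.8, biased upward
```

Nothing in the regression output flags the problem: the standard error is tiny and every in-sample diagnostic looks fine, which is exactly why the untestable orthogonality assumption is doing all the work.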

‘Modern’ mainstream economics is based on the belief that deductive-axiomatic modelling is a sufficient guide to truth. That belief is, however, totally unfounded as long as no reasons are supplied for believing in the assumptions on which the model-based deductions and conclusions build. ‘Mathiness’ masquerading as science is often used by mainstream economists to hide the problematic character of the assumptions used in their theories and models. But without showing the model assumptions to be realistic and relevant, that kind of economics indeed, as Romer puts it, produces nothing but “blah blah blah.”

Building a science of economics for the real world

22 May, 2017 at 16:51 | Posted in Economics | 1 Comment

Following the greatest economic depression since the 1930s, Robert Solow in 2010 gave a prepared statement on “Building a Science of Economics for the Real World” for a hearing in the U.S. Congress. According to Solow, modern macroeconomics has not only failed at solving present economic and financial problems, but is “bound” to fail. Building microfounded macromodels on “assuming the economy populated by a representative agent” — consisting of “one single combination worker-owner-consumer-everything-else who plans ahead carefully and lives forever” — does not pass the smell test: does this really make sense? Solow surmised that a thoughtful person, “faced with the thought that economic policy was being pursued on this basis, might reasonably wonder what planet he or she is on.”

Conclusion: an economic theory or model that doesn’t pass the real world smell-test is just silly nonsense that doesn’t deserve our attention and therefore belongs in the dustbin.

Rational expectations — not to mention the efficient market hypothesis, NAIRU, and DSGE models — immediately come to mind.

Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies and institutions are those based on rational expectations and representative actors. As I tried to show in my paper Rational expectations — a fallacious foundation for macroeconomics in a non-ergodic world, there is really no support for this conviction at all. On the contrary. If we want to have anything of interest to say about real economies, financial crises, and the decisions and choices real people make, it is high time to place macroeconomic models built on representative actors and rational-expectations microfoundations where they belong — in the dustbin.

If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

Neoliberalism — an oversold ideology

22 May, 2017 at 14:37 | Posted in Economics | 3 Comments

So what’s wrong with the economy? …

A 2002 study of United States fiscal policy by the economists Olivier Blanchard and Roberto Perotti found that ‘both increases in taxes and increases in government spending have a strong negative effect on private investment spending.’ They noted that this finding is ‘difficult to reconcile with Keynesian theory.’

Consistent with this, a more recent study of international data by the economists Alberto Alesina and Silvia Ardagna found that ‘fiscal stimuli based on tax cuts are more likely to increase growth than those based on spending increases.’

Greg Mankiw

From Mankiw’s perspective ‘the Alesina work suggests a still plausible hypothesis.’

Hmm …

Austerity policies not only generate substantial welfare costs due to supply-side channels, they also hurt demand — and thus worsen employment and unemployment. The notion that fiscal consolidations can be expansionary (that is, raise output and employment), in part by raising private sector confidence and investment, has been championed by, among others, Harvard economist Alberto Alesina in the academic world and by former European Central Bank President Jean-Claude Trichet in the policy arena. However, in practice, episodes of fiscal consolidation have been followed, on average, by drops rather than by expansions in output. On average, a consolidation of 1 percent of GDP increases the long-term unemployment rate by 0.6 percentage point and raises by 1.5 percent within five years the Gini measure of income inequality …

The evidence of the economic damage from inequality suggests that policymakers should be more open to redistribution than they are. Of course, apart from redistribution, policies could be designed to mitigate some of the impacts in advance—for instance, through increased spending on education and training, which expands equality of opportunity (so-called predistribution policies). And fiscal consolidation strategies—when they are needed—could be designed to minimize the adverse impact on low-income groups. But in some cases, the untoward distributional consequences will have to be remedied after they occur by using taxes and government spending to redistribute income. Fortunately, the fear that such policies will themselves necessarily hurt growth is unfounded.

Jonathan Ostry, Prakash Loungani, and David Furceri

Economists have a tendency to get enthralled by their theories and models, and to forget that behind the figures and abstractions there is a real world with real people. Real people who have to pay dearly for fundamentally flawed doctrines and recommendations.

Let’s make sure the consequences will rest on the conscience of those economists.
