Suggestion for Krugman’s reading list

31 Mar, 2012 at 18:42 | Posted in Economics | 2 Comments

As we all know, Paul Krugman is very fond of referring to and defending the old and dear IS-LM model.

John Hicks – the man who invented it in his 1937 Econometrica review of Keynes’s General Theory, “Mr. Keynes and the ‘Classics’: A Suggested Interpretation” – returned to it in a 1980 article, “IS-LM: An Explanation”, in the Journal of Post Keynesian Economics. Self-critically he wrote:

I accordingly conclude that the only way in which IS-LM analysis usefully survives — as anything more than a classroom gadget, to be superseded, later on, by something better – is in application to a particular kind of causal analysis, where the use of equilibrium methods, even a drastic use of equilibrium methods, is not inappropriate. I have deliberately interpreted the equilibrium concept, to be used in such analysis, in a very stringent manner (some would say a pedantic manner) not because I want to tell the applied economist, who uses such methods, that he is in fact committing himself to anything which must appear to him to be so ridiculous, but because I want to ask him to try to assure himself that the divergences between reality and the theoretical model, which he is using to explain it, are no more than divergences which he is entitled to overlook. I am quite prepared to believe that there are cases where he is entitled to overlook them. But the issue is one which needs to be faced in each case.

When one turns to questions of policy, looking toward the future instead of the past, the use of equilibrium methods is still more suspect. For one cannot prescribe policy without considering at least the possibility that policy may be changed. There can be no change of policy if everything is to go on as expected – if the economy is to remain in what (however approximately) may be regarded as its existing equilibrium. It may be hoped that, after the change in policy, the economy will somehow, at some time in the future, settle into what may be regarded, in the same sense, as a new equilibrium; but there must necessarily be a stage before that equilibrium is reached …

I have paid no attention, in this article, to another weakness of IS-LM analysis, of which I am fully aware; for it is a weakness which it shares with General Theory itself. It is well known that in later developments of Keynesian theory, the long-term rate of interest (which does figure, excessively, in Keynes’ own presentation and is presumably represented by the r of the diagram) has been taken down a peg from the position it appeared to occupy in Keynes. We now know that it is not enough to think of the rate of interest as the single link between the financial and industrial sectors of the economy; for that really implies that a borrower can borrow as much as he likes at the rate of interest charged, no attention being paid to the security offered. As soon as one attends to questions of security, and to the financial intermediation that arises out of them, it becomes apparent that the dichotomy between the two curves of the IS-LM diagram must not be pressed too hard.

Back in 1937 John Hicks said that he was building a model of John Maynard Keynes’ General Theory. He wasn’t.

What Hicks acknowledged in 1980 is basically that his original review totally ignored the very core of Keynes’s theory – uncertainty. In doing so he actually sent the train of macroeconomics down the wrong track for decades. It’s about time that neoclassical economists – Krugman, Mankiw, or whoever – set the record straight and stopped promoting something that its creator himself admitted was a total failure. Why not study the real thing itself – the General Theory – in full, and without looking the other way when it comes to non-ergodicity and uncertainty?

It is time to scrap the stabilization-policy framework!

31 Mar, 2012 at 10:47 | Posted in Economics | 5 Comments

How should economic policy best be conducted in an “extreme situation” like the one we are in today?

Many economists think we should stick to the prevailing stabilization-policy framework.

But not even the most extreme rules-based policy can manage without a considerable degree of flexibility. Monetary and fiscal policy have to be conducted differently when we are close to the zero lower bound and liquidity traps make it difficult to pursue the expansionary monetary policy that is needed. Here it is usually optimal to deviate from stated inflation targets and to apply escape clauses that give central banks enough flexibility to depart from the rule-governed low-inflation policy.

As several researchers have been able to show (e.g. here), multiplier effects may very well be > 1 in liquidity traps of this kind. In that situation, refraining – as in Sweden – from expansionary fiscal policy and mostly settling for tax cuts is usually counterproductive, since the effects are largely confined to raising the real interest rate and increasing private saving. In liquidity traps, public spending simply and regularly gives more “bang for the buck” than tax cuts do.
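A textbook way of seeing why spending beats tax cuts here is to compare the standard Keynesian-cross multipliers. A minimal sketch, assuming a simple closed economy and a marginal propensity to consume of 0.8 (an illustrative figure of my own, not taken from the post):

```python
# Standard Keynesian-cross multipliers (a simplified, static illustration):
#   spending multiplier: dY/dG = 1 / (1 - MPC)
#   tax-cut multiplier:  dY/dT = MPC / (1 - MPC)

mpc = 0.8  # assumed marginal propensity to consume

spending_multiplier = 1 / (1 - mpc)   # ≈ 5.0
tax_cut_multiplier = mpc / (1 - mpc)  # ≈ 4.0

# A stimulus of 10, spent directly vs handed out as tax cuts:
print(10 * spending_multiplier)  # ≈ 50 - every unit is spent at least once
print(10 * tax_cut_multiplier)   # ≈ 40 - part of the tax cut leaks into saving
```

In a liquidity trap the leakage into saving is, if anything, larger than this static calculation suggests – which is precisely the point made above.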

That there would be no reason to abandon the ordinary fiscal-policy framework in more normal times is cold comfort, now that liquidity traps and debt deflation have turned out to be recurring features of our economy.

We live in economically troubled and turbulent times. Half-measures and faint-hearted proposals that only change the stabilization-policy framework at the margin will not do – since the prevailing framework in all likelihood helps to make adequate crisis management impossible in an “extreme situation” like the one we live in today.

The Riksbank’s inflation target of 2% is regarded by the establishment economists of this country as all but holy and heaven-sent. Others of us economists are more doubtful. Paul Krugman and the IMF – rightly – do not share this somewhat religious adherence to a target of exactly 2%:

But why is the inflation target only 2 percent?

Actually, I understand why; the inflation hawks are still a powerful force that must be appeased. But the truth is that recent experience has made an overwhelming case for the proposition that the 2 percent or so implicit target prior to the Great Recession was too low, that 4 or 5 percent would be much better. Even the chief economist at the IMF says so …

The thing is, if we’re going to lock in a formal inflation target, now would be a good time to get it right, instead of waiting until the memory of the crisis fades and everyone gets complacent again.

Oxford professor Simon Wren-Lewis has a readable post on his blog today that further strengthens the impression that it is about time for the economics profession to get out of its somewhat religious fixation on what are, at bottom, fairly arbitrary economic-policy targets:

Good policy takes account of risks, and what you can do about them. Being at the zero lower bound means that you do not do things that deflate demand unless you believe growth will be strong anyway. If that is what the government believed in 2010 they were foolish indeed …
As I continue to be surprised at the number of very good and sensible economists who seem reluctant to acknowledge that fiscal policy matters for demand when monetary policy is constrained, I fear they might have been led astray in part by (selective) advice received …
However, I am still not sure. As the recession hit, Osborne consistently argued against fiscal stimulus. In April 2009, George Osborne [shadow chancellor 2005-2010, since then Chancellor of the Exchequer in Cameron’s coalition government] gave a speech which included a short history of macroeconomic thought, even including references to the Lucas critique. It ended up with New Keynesian models, and he then said this:
“[New Keynesian] Models of this kind underpin our whole macroeconomic policy framework – in particular the idea that by using monetary policy to manage demand and control inflation you can keep unemployment low and stable. And they underpinned the argument David Cameron and I advanced last autumn – that monetary policy should bear the strain of stimulating demand – an argument echoed by the Governor of the Bank of England last month when he said that “monetary policy should bear the brunt of dealing with the ups and downs of the economy”. We now appear to be winning that argument hands down.”
The previous month, the Bank of England reduced interest rates to 0.5%, where they have remained ever since. So a month after interest rates hit the zero lower bound, Osborne gives a speech which included a perfectly sensible account of macroeconomic policy, except when you hit a zero lower bound. So perhaps they were after all ‘very foolish indeed’.

Conducting economic policy in sunshine and fair weather is one thing. Conducting the same kind of rule-bound policy when the weather reports warn of squalls and dark clouds is truly “very foolish indeed”.

Uncertainty and ergodicity – the important difference between Keynes and Knight

30 Mar, 2012 at 14:34 | Posted in Economics, Statistics & Econometrics, Theory of Science & Methodology | 8 Comments

This week I’ve had an interesting discussion with Paul Davidson – founder and editor of the Journal of Post Keynesian Economics – on uncertainty and ergodicity, on the Real-World Economics Review Blog. It all started with me commenting on Davidson’s article “Is economics a science? Should economics be rigorous?”:

LPS:

Davidson’s article is a nice piece – but ergodicity is a difficult concept that many students of economics have problems understanding. To understand real world “non-routine” decisions and unforeseeable changes in behaviour, ergodic probability distributions are of no avail. In a world full of genuine uncertainty – where real historical time rules the roost – the probabilities that ruled the past are not those that will rule the future.

Time is what prevents everything from happening at once. To simply assume that economic processes are ergodic and concentrate on ensemble averages – and a fortiori in any relevant sense timeless – is not a sensible way of dealing with the kind of genuine uncertainty that permeates open systems such as economies.

When you assume economic processes to be ergodic, ensemble and time averages are identical. Let me give an example: Assume we have a market with an asset priced at 100 €. Then imagine the price first goes up by 50% and then later falls by 50%. The ensemble average for this asset would be 100 € – because we here envision two parallel universes (markets) where the asset price falls in one universe (market) by 50% to 50 €, and in the other universe (market) goes up by 50% to 150 €, giving an average of 100 € ((150+50)/2). The time average for this asset would be 75 € – because we here envision one universe (market) where the asset price first rises by 50% to 150 €, and then falls by 50% to 75 € (0.5*150).

From the ensemble perspective nothing really, on average, happens. From the time perspective lots of things really, on average, happen.

Assuming ergodicity there would have been no difference at all.
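The arithmetic, and what happens when the process is repeated, can be checked in a few lines. A minimal sketch (my own illustration of the example above):

```python
import random

p0 = 100.0

# Ensemble average: two parallel markets, one up 50%, one down 50%.
ensemble_avg = (p0 * 1.5 + p0 * 0.5) / 2  # 100.0 - "on average" nothing happens

# Time path: one market, up 50% and then down 50%.
end_value = p0 * 1.5 * 0.5                # 75.0 - the investor has lost a quarter

print(ensemble_avg, end_value)

# Repeated over time the gap widens: each up-down pair multiplies wealth
# by 0.75, so a single trajectory almost surely decays even though the
# ensemble mean stays flat at 100.
p = p0
for _ in range(20):
    p *= random.choice([1.5, 0.5])
print(p)  # typically far below 100
```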

Just in case you think this is just an academic quibble without repercussions for our real lives, let me quote from an article by physicist and mathematician Ole Peters in the Santa Fe Institute Bulletin from 2009 – “On Time and Risk” – that makes it perfectly clear that the flaw in thinking about uncertainty in terms of “rational expectations” and ensemble averages has had real repercussions on the functioning of the financial system:

“In an investment context, the difference between ensemble averages and time averages is often small. It becomes important, however, when risks increase, when correlation hinders diversification, when leverage pumps up fluctuations, when money is made cheap, when capital requirements are relaxed. If reward structures—such as bonuses that reward gains but don’t punish losses, and also certain commission schemes—provide incentives for excessive risk, problems arise. This is especially true if the only limits to risk-taking derive from utility functions that express risk preference, instead of the objective argument of time irreversibility. In other words, using the ensemble average without sufficiently restrictive utility functions will lead to excessive risk-taking and eventual collapse. Sound familiar?”

PD:

Lars, if the stochastic process is ergodic, then for infinite realizations the time and space (ensemble) averages will coincide. An ensemble is a set of samples drawn at a fixed point of time from a universe of realizations. For finite realizations, the time and space statistical averages tend to converge (with a probability of one) the more data one has.

Even in physics there are some processes that physicists recognize as governed by nonergodic stochastic processes. [See A. M. Yaglom, An Introduction to Stationary Random Functions (Prentice-Hall, 1962).]

I do object to the passage quoted from Ole Peters where he talks about “when risks increase”. Nonergodic systems are not about increasing or decreasing risk in the sense of the probability distribution variances differing. They are about indicating that any probability distribution based on past data cannot be reliably used to indicate the probability distribution governing any future outcome. In other words, even if (we could know that) the future probability distribution will have a smaller variance (“lower risks”) than the past calculated probability distribution, the past distribution is not a reliable guide to future statistical means and other moments around the means.
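To make the ergodic/nonergodic distinction concrete, here is a minimal simulation sketch of my own (assuming, purely for illustration, a stationary AR(1) as the ergodic case and a random walk as a simple nonergodic one):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 20_000  # length of one long realization
N = 20_000  # number of parallel realizations

# Ergodic case: stationary AR(1), x_t = 0.5 * x_{t-1} + eps_t.
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + rng.normal()
time_average = x.mean()  # average along one path

# Ensemble: N independent AR(1) processes observed at one late date.
y = np.zeros(N)
for _ in range(200):  # burn in to the stationary distribution
    y = 0.5 * y + rng.normal(size=N)
ensemble_average = y.mean()  # average across realizations

print(time_average, ensemble_average)  # both ≈ 0: the two averages coincide

# Nonergodic case: a random walk has no stationary distribution, and the
# time average of a single path is itself random - past data cannot pin
# down the distribution governing the future.
walk = np.cumsum(rng.normal(size=T))
print(walk.mean())  # varies wildly from one realization to the next
```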

LPS:

Paul, re nonergodic processes in physics I would even say that MOST processes definitely are nonergodic. Re Ole Peters, I totally agree that what is important about real social and economic processes being nonergodic is that uncertainty – not risk – rules the roost. That was something both Keynes and Knight basically said in their 1921 books. But I still think that Peters’ discussion is a good example of how thinking about uncertainty in terms of “rational expectations” and “ensemble averages” has had seriously bad repercussions on the financial system.

PD:

Lars, there is a difference between the uncertainty concept developed by Keynes and the one developed by Knight.

As I have pointed out, Keynes’s concept of uncertainty involves a nonergodic stochastic process. On the other hand, Knight’s uncertainty – like Taleb’s black swan – assumes an ergodic process. The difference is that for Knight (and Taleb) the uncertain outcome lies so far out in the tail of the unchanging (over time) probability distribution that it appears empirically to be [in Knight’s terminology] “unique”. In other words, like Taleb’s black swan, the uncertain outcome already exists in the probability distribution but is so rarely observed that it may take several lifetimes for one observation – making that observation “unique”.

In the latest edition of Taleb’s book, he was forced to concede that philosophically there is a difference between a nonergodic system and a black swan ergodic system – but he then waves away the problem with the claim that the difference is irrelevant.

LPS:

Paul, on the whole, I think you’re absolutely right on this. Knight’s uncertainty concept has an epistemological foundation and Keynes’s definitely an ontological one. Of course this also has repercussions for the issue of ergodicity in a strict methodological and mathematical-statistical sense. I think Keynes’s view is the more warranted of the two.

BUT – from a “practical” point of view I have to agree with Taleb. Because if there is no reliable information on the future, whether you talk of epistemological or ontological uncertainty, you can’t calculate probabilities.

The most interesting and far-reaching difference between the epistemological and the ontological view is that if you subscribe to the former, Knightian view – as Taleb and “black swan” theorists basically do – you open the door to the mistaken belief that with better information and greater computing power we should somehow always be able to calculate probabilities and describe the world as an ergodic universe. As both you and Keynes have convincingly argued, that is ontologically just not possible.

PD:

Lars, your last sentence says it all. If you believe it is an ergodic system and epistemology is the only problem, then you should urge more transparency, better data collection, hiring more “quants” on Wall Street to generate “better” risk-management computer programs, etc. – and above all keep the government out of regulating financial markets – since all the government can do is foul up the outcome that the ergodic process is ready to deliver.

Long live Stiglitz and the call for transparency to end asymmetric information — and permit all to know the epistemological solution for the ergodic process controlling the economy.

Or as Milton Friedman would say, those who make decisions “as if” they knew the ergodic stochastic process create an optimum market solution – while those who make mistakes in trying to figure out the ergodic process are like the dinosaurs, doomed to fail and die off – leaving only the survival of the fittest for a free market economy to prosper on. The proof, supposedly, is why all those 1% fat-cat CEO managers in the banking business receive such large salaries for their “correct” decisions involving financial assets.

Alternatively, if the financial and economic system is nonergodic then there is a positive role for government to regulate what decision makers can do so as to prevent them from mass destruction of themselves and other innocent bystanders – and also for government to take positive action when the herd behavior of decision makers is causing the economy to run off the cliff.

So this distinction between ergodic and nonergodic processes is essential if we are to build institutional structures that make running off the cliff almost impossible – and if the government is to be ready to take action when some innovative fool(s) discovers a way to get around institutional barriers and starts to run the economy off the cliff.

To Keynes the source of uncertainty was in the nature of the real – nonergodic – world. It had to do not only, or primarily, with the epistemological fact of us not knowing the things that today are unknown, but rather with the much deeper and far-reaching ontological fact that there often is no firm basis on which we can form quantifiable probabilities and expectations.

Has “modern” macroeconomics delivered?

29 Mar, 2012 at 14:37 | Posted in Economics | Comments Off on Has “modern” macroeconomics delivered?

Jonathan Schlefer, research associate at Harvard Business School, has written a new book – The Assumptions Economists Make – on what “modern” macroeconomics has delivered during the last few decades. Although Schlefer shares Milton Friedman’s instrumentalist view on theories and models, he can’t really see that they have delivered what they promised – useful predictions. Justin Fox summarizes:

By that standard, here are Schlefer’s judgments on the succession of theories that have dominated academic macroeconomics since the 1970s:

Rational expectations (which proposed that we’re all too smart to be fooled by money-printing central bankers and deficit-spending governments): Intellectually interesting, and maybe helpful in “normal times,” whatever those are. But not very good at describing or predicting the actual behavior of the economy at any time, and worthless in a crisis.

Real business-cycle theory (which says that economic ups and downs are all caused by technology-induced changes in productivity): “[N]ot only are these models a tautology — they are a tautology that turns out to be wrong. They say that employment rises or falls because actors choose to work more when productivity is high and less when it’s low. This is nuts.”

DSGE (sometimes called “New Keynesian”) models: Not “quite as bad as they sound,” as they do describe an economy that moves along by fits and starts. They just don’t leave room for any crazy stuff.

Keen answers Krugman

29 Mar, 2012 at 11:42 | Posted in Economics | 4 Comments

Steve Keen has answered Krugman.
To the point.
Brief.
Read it!

Neoclassical economics – visionless and escaping responsibility

28 Mar, 2012 at 23:29 | Posted in Economics, Theory of Science & Methodology | Comments Off on Neoclassical economics – visionless and escaping responsibility

A while ago Roger Backhouse and Bradley Bateman had a nice piece in the New York Times on the lack of perspectives and alternatives shown by mainstream economists when dealing with the systemic crises of modern economies:

Economists do much better when they tackle small, well-defined problems. As John Maynard Keynes put it, economists should become more like dentists: modest people who look at a small part of the body but remove a lot of pain.

However, there are also downsides to approaching economics as a dentist would: above all, the loss of any vision about what the economic system should look like. Even Keynes himself was driven by a powerful vision of capitalism. He believed it was the only system that could create prosperity, but it was also inherently unstable and so in need of constant reform. This vision caught the imagination of a generation that had experienced the Great Depression and World War II and helped drive policy for nearly half a century …

In the 20th century, the main challenge to Keynes’s vision came from economists like Friedrich Hayek and Milton Friedman, who envisioned an ideal economy involving isolated individuals bargaining with one another in free markets. Government, they contended, usually messes things up. Overtaking a Keynesianism that many found inadequate to the task of tackling the stagflation of the 1970s, this vision fueled neoliberal and free-market conservative agendas of governments around the world.

THAT vision has in turn been undermined by the current crisis. It took extensive government action to prevent another Great Depression, while the enormous rewards received by bankers at the heart of the meltdown have led many to ask whether unfettered capitalism produced an equitable distribution of wealth. We clearly need a new, alternative vision of capitalism. But thanks to decades of academic training in the “dentistry” approach to economics, today’s Keynes or Friedman is nowhere to be found.

And now Philip Mirowski has an equally interesting article on opendemocracy.net explaining why neoclassical economists

don’t seem to have suffered one whit for the subsequent sequence of events, a slow-motion train wreck that one might reasonably have expected would have rubbished the credibility of lesser mortals.

Read it!

Centerpartiet and teachers’ salaries

27 Mar, 2012 at 10:41 | Posted in Economics, Education & School | Comments Off on Centerpartiet and teachers’ salaries

The neoliberal Centerpartiet – headed by Annie Lööf, a devout admirer of Margaret Thatcher and Ayn Rand – has proposed that entry-level wages be lowered in order to create jobs. As I have written before, this is a completely crazy idea, with little or no foundation in economic good sense.

But one may also wonder how things stand with the party’s ideological purity. For the party also claims that the state should play an active role in the wage-formation process. That does not sound very liberal – or neoliberal. And why does the party not take the same stance when it comes to schools and teachers’ pay? Would it not have been consistent to also demand state control of teachers’ salaries? Earmarking of central-government grants to the municipalities (so that the money is conditioned on also raising teachers’ pay)? Renationalization of the school system? I’m just asking.

Lowering wages solves nothing

24 Mar, 2012 at 14:53 | Posted in Economics | 14 Comments

With every passing month the economic downturn in Sweden gathers strength. The coming years threaten to bring a prolonged slump in the Swedish economy.

In this, to say the least, difficult situation – with the economy once again on the ropes – we have in recent weeks been able to read how both the government and the employers’ side have proposed trying to solve the crisis by means of wage cuts in one form or another (not least youth wages have been on the table).

This should really be seen, above all, as a sign of how low confidence in the economic system has sunk. From yesterday’s neoliberal wet dreams about the commanding heights of the economy we have descended into the economic reality of bank crashes, company closures and galloping unemployment. For it is obviously not the case that wage cuts save jobs. In a situation where the crisis that began around 2007-08 is far from over, either globally or here at home, what we need more than anything else are stimulus measures and an economic policy that raises effective demand.

At the level of society as a whole, wage cuts only increase the risk that even more people will end up without jobs. To believe that crises can be solved this way is a regression to the misguided economic theory and policy that John Maynard Keynes definitively settled accounts with already in the 1930s. It was a policy that made millions of people around the world unemployed. The depression of the thirties was accompanied by deflation, which admittedly could mean higher real wages – but only for those who managed to keep their jobs.

Of course, freezing or cutting wages may work in the short run for individual firms and unions. But it is an atomistic fallacy to believe that a general policy of wage cuts would boost the economy. On the contrary. As Keynes showed, the aggregate effect of wage cuts is catastrophic. They set off a cumulative price-cutting spiral that makes the real value of firms’ and households’ debts grow, since the debts are nominally unaffected by the general movement of prices and wages. In an economy increasingly built on borrowing and indebtedness, this becomes the gateway to a deflationary crisis. This in turn means that no one wants to borrow money or capital, since the burden of repayment becomes too heavy over time. Firms become insolvent, investment falls, unemployment rises and depression is at the door.

Let us speak. And speak plainly. The risk of debt deflation can never be played down. Looking at Swedish data on changes in inflation and GDP for the years 1980-2008, we find that for every point by which real GDP falls short of potential GDP, the inflation rate tends to fall by half a point. This has to be taken completely seriously. If the gap between real and potential GDP widens over the next few years, deflation may soon become a fact.
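A back-of-the-envelope version of that rule of thumb (a sketch with illustrative numbers of my own – the post cites the relationship, not these figures):

```python
# Rule of thumb cited above: each point by which real GDP falls short of
# potential GDP tends to lower the inflation rate by about half a point.
SLOPE = 0.5

def projected_inflation(current_inflation: float, gdp_shortfall: float) -> float:
    """gdp_shortfall: percent of potential GDP by which real GDP falls short."""
    return current_inflation - SLOPE * gdp_shortfall

# Illustrative numbers (assumptions, not data from the post):
print(projected_inflation(2.0, 2.0))  # 1.0  - two points of slack
print(projected_inflation(2.0, 6.0))  # -1.0 - a deep slump tips into deflation
```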

The overriding problem for the Swedish economy is that we cannot get consumption and lending going. Confidence and effective demand must be restored. We obviously cannot just sit with our arms crossed waiting for the storm to pass – but proposing crisis remedies built on wage cuts is to write a prescription for even worse disasters. If the government and the employers want to take responsibility for society – and not let simple-mindedness replace analytical capacity and level-headedness – they must clearly and unambiguously repudiate all manner of short-sighted and counterproductive wage-cutting strategies. Getting out of deep economic troughs requires sharper, better and more effective instruments – such as an active and aggressive fiscal policy.

Austerity in depressed economies

23 Mar, 2012 at 11:44 | Posted in Economics | Comments Off on Austerity in depressed economies

Paul Krugman has a nice piece today on his blog, commenting on the recently published paper by Brad DeLong and Larry Summers on fiscal policy in a depressed economy.
Even though the long-run effects of austerity policies may be minimal, they certainly impose large costs here and now. Self-evident conclusion: stop cutting and start stimulating, at least as long as we are still trapped in a liquidity trap.

I’ve been posting various versions of a scatterplot showing the relationship between one indicator of fiscal policy and growth since the crisis began. Here’s a version restricted to eurozone countries and countries maintaining a fixed exchange rate against the euro, with many of the countries labeled:

(All data from Eurostat).

Still, is this the kind of outcome you would have expected if you believed what the Austerians were saying? Or is it what you would have expected if you’d been reading those of us horrified by the turn to austerity?

What gets me about all of this is the incredible, unwarranted arrogance of the austerians. They decided that they knew better than textbook macroeconomics, even though none of them had predicted the crisis or even seen the possibility of such a crisis.

And the wreckage now lies all around us.

New-Keynesian macroeconomics and involuntary unemployment

21 Mar, 2012 at 16:25 | Posted in Economics | 7 Comments

People calling themselves “new-Keynesians” – a gross misnomer – ought to be rather embarrassed by the fact that the kind of microfounded dynamic stochastic general equilibrium models they use cannot incorporate such a basic fact of reality as involuntary unemployment!

Of course, working with representative agent models, this should come as no surprise. If one representative agent is employed, all representative agents are. The kind of unemployment that occurs is voluntary, since it is only adjustments of the hours of work that these optimizing agents make to maximize their utility.
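A toy version of the point – a minimal sketch of my own, with an invented log-utility specification rather than any particular published model – shows why: the representative agent always chooses its hours, so nothing resembling rationing can ever appear:

```python
import numpy as np

# Representative-agent labor supply: choose hours h to maximize
#   u(h) = log(w * h) - chi * h     (log consumption minus disutility of work)
# The first-order condition 1/h = chi gives h* = 1/chi at ANY wage w.
chi = 2.0  # assumed disutility-of-work parameter

for w in (0.5, 1.0, 2.0):
    h_grid = np.linspace(1e-6, 5.0, 100_000)
    utility = np.log(w * h_grid) - chi * h_grid
    h_star = h_grid[utility.argmax()]
    print(w, round(h_star, 3))  # ≈ 0.5 whatever the wage: hours are always
                                # freely chosen, so "unemployment" can only
                                # ever show up as a voluntary choice
```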

For a “new-Keynesian” it ought to be of interest to know what Keynes had to say on the issue. In the General Theory (1936, chapter 2), he writes:

The classical school [maintains that] while the demand for labour at the existing money-wage may be satisfied before everyone willing to work at this wage is employed, this situation is due to an open or tacit agreement amongst workers not to work for less, and that if labour as a whole would agree to a reduction of money-wages more employment would be forthcoming. If this is the case, such unemployment, though apparently involuntary, is not strictly so, and ought to be included under the above category of ‘voluntary’ unemployment due to the effects of collective bargaining, etc …

The classical theory … is best regarded as a theory of distribution in conditions of full employment. So long as the classical postulates hold good, unemployment, which is in the above sense involuntary, cannot occur. Apparent unemployment must, therefore, be the result either of temporary loss of work of the ‘between jobs’ type or of intermittent demand for highly specialised resources or of the effect of a trade union ‘closed shop’ on the employment of free labour. Thus writers in the classical tradition, overlooking the special assumption underlying their theory, have been driven inevitably to the conclusion, perfectly logical on their assumption, that apparent unemployment (apart from the admitted exceptions) must be due at bottom to a refusal by the unemployed factors to accept a reward which corresponds to their marginal productivity …

Obviously, however, if the classical theory is only applicable to the case of full employment, it is fallacious to apply it to the problems of involuntary unemployment – if there be such a thing (and who will deny it?). The classical theorists resemble Euclidean geometers in a non-Euclidean world who, discovering that in experience straight lines apparently parallel often meet, rebuke the lines for not keeping straight – as the only remedy for the unfortunate collisions which are occurring. Yet, in truth, there is no remedy except to throw over the axiom of parallels and to work out a non-Euclidean geometry. Something similar is required to-day in economics. We need to throw over the second postulate of the classical doctrine and to work out the behaviour of a system in which involuntary unemployment in the strict sense is possible.

The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging de facto is made, macroeconomic model-building is little more than “hand waving” that gives us rather little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.


Economics and Reality

19 Mar, 2012 at 20:23 | Posted in Economics, Theory of Science & Methodology | Comments Off on Economics and Reality

“Modern” economics has become increasingly irrelevant to the understanding of the real world. In his seminal book Economics and Reality (1997) Tony Lawson traced this irrelevance to the failure of economists to match their deductive-axiomatic methods with their subject.
It is – sad to say – as relevant today as it was fifteen years ago.
It is still a fact that within mainstream economics internal validity is everything and external validity nothing. Why anyone should be interested in those kinds of theories and models is beyond my imagination. As long as mainstream economists do not come up with any export licenses for their theories and models to the real world in which we live, they really should not be surprised if people say that this is not science, but autism!
Studying mathematics and logic is interesting and fun. It sharpens the mind. In pure mathematics and logic we do not have to worry about external validity. But economics is not pure mathematics or logic. It’s about society. The real world. Forgetting that, economics is really in dire straits.

Economics and Reality was a great inspiration to me fifteen years ago. It still is.

Danish lesson (II)

18 Mar, 2012 at 19:44 | Posted in Varia | Comments Off on Danish lesson (II)

“Modern” macroeconomics and uncertainty

17 Mar, 2012 at 16:26 | Posted in Economics, Statistics & Econometrics, Theory of Science & Methodology | 1 Comment

The financial crisis of 2007-08 took most laymen and economists by surprise. What was it that went wrong with our macroeconomic models, since they obviously did not foresee the collapse or even make it conceivable?

There are many who have ventured to answer this question. And they have come up with a variety of answers, ranging from the exaggerated mathematization of economics to irrational and corrupt politicians.

But the root of our problem goes much deeper. It ultimately goes back to how we look upon the data we are handling. In “modern” macroeconomics – dynamic stochastic general equilibrium, new synthesis, new-classical and new-Keynesian – variables are treated as if drawn from a known “data-generating process” that unfolds over time and of which we therefore have heaps of historical time-series observations. If we do not assume that we know the “data-generating process” – if we do not have the “true” model – the whole edifice collapses. And of course it has to. I mean, who really honestly believes that we should have access to this mythical Holy Grail, the data-generating process?

“Modern” macroeconomics obviously did not anticipate the enormity of the problems that unregulated “efficient” financial markets created. Why? Because it builds on the myth that we know the “data-generating process” and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

This is like saying that you are going on a holiday trip and that you know that the chance of the weather being sunny is at least 30%, and that this is enough for you to decide whether or not to bring your sunglasses. You are supposed to be able to calculate the expected utility based on the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.

But as Keynes convincingly argued in his monumental Treatise on Probability (1921), this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10%, and according to another – equally good – model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to.
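The difference is easy to state in decision-theoretic terms. A minimal sketch of my own (the utilities are invented for the example):

```python
# Under RISK there is one known probability of sun, hence a unique expected
# utility, and the sunglasses decision follows mechanically.
def expected_utility(p_sun: float, u_sun: float, u_rain: float) -> float:
    return p_sun * u_sun + (1 - p_sun) * u_rain

# Invented utilities: sunglasses pay off in sun, are dead weight in rain.
bring = expected_utility(0.30, u_sun=10, u_rain=-2)  # 1.6
leave = expected_utility(0.30, u_sun=0, u_rain=0)    # 0.0
print(bring > leave)  # True: with p = 0.3 the calculus gives one answer

# Under UNCERTAINTY, equally good models disagree about p itself, so there
# is no unique expected utility to maximize - the sign of the answer flips
# within the range of plausible models.
for p in (0.10, 0.40):
    print(p, expected_utility(p, 10, -2))  # -0.8 vs 2.8
```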

In the end this is what it all boils down to. We all know that many activities, relations, processes and events are of the Keynesian uncertainty-type. The data do not unequivocally single out one decision as the only “rational” one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works. 

Some macroeconomists, however, still want to be able to use their hammer. So they decide to pretend that the world looks like a nail, and pretend that uncertainty can be reduced to risk. So they construct their mathematical models on that assumption. The result: financial crises and economic havoc.

How much better – how much smaller the risk of lulling ourselves into the comforting thought that we know everything, that everything is measurable and that we have everything under control – if we instead could just admit that we often simply do not know, and that we have to live with that uncertainty as best we can.

Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheel is a sure recipe for only one thing – economic catastrophe!

The state of microfoundations and macroeconomics – Robert Solow says it all

16 Mar, 2012 at 15:15 | Posted in Economics, Theory of Science & Methodology | 3 Comments

The purported strength of new-classical and new-Keynesian macroeconomics is that they have firm anchorage in preference-based microeconomics, and especially in the decisions taken by intertemporal utility-maximizing “forward-looking” individuals.

To some of us, however, this has come at too high a price. The almost quasi-religious insistence that macroeconomics has to have microfoundations – without ever presenting either ontological or epistemological justifications for this claim – has turned a blind eye to the weakness of the whole enterprise of trying to depict a complex economy based on an all-embracing representative actor equipped with superhuman knowledge, forecasting abilities and forward-looking rational expectations. It is as if – after having swallowed the sour grapes of the Sonnenschein-Mantel-Debreu theorem – these economists want to resurrect the omniscient Walrasian auctioneer in the form of all-knowing representative actors equipped with rational expectations and assumed to somehow know the true structure of our model of the world (how that could even be conceivable is beyond my imagination, given that the ongoing debate on microfoundations, if anything, shows that not even we, the economists, can come to agreement on a common model).

Following the greatest economic depression since the 1930s, the grand old man of modern economic growth theory, Nobel laureate Robert Solow, gave a prepared statement on “Building a Science of Economics for the Real World” for a hearing in the U.S. Congress on July 20, 2010. According to Solow, modern macroeconomics has not only failed at solving present economic and financial problems, but is “bound” to fail. Building dynamic stochastic general equilibrium (DSGE) models on the assumption of an economy populated by a representative agent – “one single combination worker-owner-consumer-everything-else who plans ahead carefully and lives forever” – does not pass “the smell test: does this really make sense?” One cannot but concur in Solow’s surmise that a thoughtful person, “faced with the thought that economic policy was being pursued on this basis, might reasonably wonder what planet he or she is on.”

Already in 2008 Solow had – in “The State of Macroeconomics” (Journal of Economic Perspectives 2008:243-249) – told us what he thought of microfounded “modern macro”:

[When modern macroeconomists] speak of macroeconomics as being firmly grounded in economic theory, we know what they mean … They mean a macroeconomics that is deduced from a model in which a single immortal consumer-worker-owner maximizes a perfectly conventional time-additive utility function over an infinite horizon, under perfect foresight or rational expectations, and in an institutional and technological environment that favors universal price-taking behavior …

No one would be driven to accept this story because of its obvious “rightness”. After all, a modern economy is populated by consumers, workers, pensioners, owners, managers, investors, entrepreneurs, bankers, and others, with different and sometimes conflicting desires, information, expectations, capacities, beliefs, and rules of behavior … To ignore all this in principle does not seem to qualify as mere abstraction – that is setting aside inessential details. It seems more like the arbitrary suppression of clues merely because they are inconvenient for cherished preconceptions …

Friends have reminded me that much effort of ‘modern macro’ goes into the incorporation of important deviations from the Panglossian assumptions … [But] a story loses legitimacy and credibility when it is spliced to a simple, extreme, and on the face of it, irrelevant special case. This is the core of my objection: adding some realistic frictions does not make it any more plausible that an observed economy is acting out the desires of a single, consistent, forward-looking intelligence …

It seems to me, therefore, that the claim that ‘modern macro’ somehow has the special virtue of following the principles of economic theory is tendentious and misleading … The other possible defense of modern macro is that, however special it may seem, it is justified empirically. This strikes me as a delusion …

So I am left with a puzzle, or even a challenge. What accounts for the ability of ‘modern macro’ to win hearts and minds among bright and enterprising academic economists? … There has always been a purist streak in economics that wants everything to follow neatly from greed, rationality, and equilibrium, with no ifs, ands, or buts … The theory is neat, learnable, not terribly difficult, but just technical enough to feel like ‘science’. Moreover it is practically guaranteed to give laissez-faire-type advice, which happens to fit nicely with the general turn to the political right that began in the 1970s and may or may not be coming to an end.

Earlier this week, new-Keynesian macroeconomist Simon Wren-Lewis asked me to explain why other ways of doing macro have (purportedly) died out and the microfoundations approach has become so dominant. In my admittedly tentative answer I wrote:

(1) One could of course say that one reason why the microfoundations approach is so dominant is – as Krugman has it on his blog today – that “trying to embed your ideas in a microfounded model can be a very useful exercise — not because the microfounded model is right, or even better than an ad hoc model, but because it forces you to think harder about your assumptions, and sometimes leads to clearer thinking”. But I don’t really believe that is an especially important reason on the whole. I mean, if people put the enormous amount of time and energy that they do into constructing macroeconomic models, then these really have to contribute substantially to our understanding and ability to explain and grasp real macroeconomic processes. If not, they should – after perhaps having helped sharpen our thoughts – be thrown into the waste-paper basket (something the father of macroeconomics, Keynes, used to do), and not, as today, be allowed to overrun our economics journals and confer on their authors lots of academic prestige.

(2) A more plausible reason is that microfoundations are in line with the reductionism inherent in the methodological individualism that almost all neoclassical economists subscribe to. And as argued by e.g. Johan Åkerman and Ekkehart Schlicht, this is deeply problematic for a macroeconomics trying to solve the “summation problem” without nullifying the possibility of emergence.

(3) Microfoundations are thought to give macroeconomists the means to fully predetermine their models and come up with definitive, robust, stable answers. In reality we know that the forecasts and expectations of individuals often differ systematically from what materializes in the aggregate, since knowledge is imperfect and uncertainty – rather than risk – rules the roost.

(4) Microfoundations allegedly get around the Lucas critique by focusing on “deep”, structural, invariant parameters of optimizing individuals’ preferences and tastes. As I have argued, this is an empty hope without solid empirical or methodological foundation.

The kind of microfoundations that “new-Keynesian” and new-classical general equilibrium macroeconomists base their models on is not – at least from a realist point of view – plausible.

As all students of economics know, time is limited. Given that, there have to be better ways to use it than spending hours and hours working through or constructing irrelevant macroeconomic models founded on microfoundations chosen more for mathematical tractability than for applicability to reality. I would rather recommend that my students allocate their time to constructing better, real and relevant macroeconomic models – models that really help us to explain and understand reality.

Of course, I could just as well have directed Wren-Lewis to Robert Solow’s article. There the answer to his question was already given four years ago.

Ask the mountains

15 Mar, 2012 at 19:24 | Posted in Varia | Comments Off on Ask the mountains
