Suggestion for Krugman’s reading list

31 March, 2012 at 18:42 | Posted in Economics | 2 Comments

As we all know, Paul Krugman is very fond of referring to and defending the old and dear IS-LM model.

John Hicks – the man who invented it in his 1937 Econometrica review of Keynes' General Theory, "Mr. Keynes and the 'Classics': A Suggested Interpretation" – returned to it in a 1980 article in the Journal of Post Keynesian Economics, "IS-LM: An Explanation". Self-critically he wrote:

I accordingly conclude that the only way in which IS-LM analysis usefully survives — as anything more than a classroom gadget, to be superseded, later on, by something better – is in application to a particular kind of causal analysis, where the use of equilibrium methods, even a drastic use of equilibrium methods, is not inappropriate. I have deliberately interpreted the equilibrium concept, to be used in such analysis, in a very stringent manner (some would say a pedantic manner) not because I want to tell the applied economist, who uses such methods, that he is in fact committing himself to anything which must appear to him to be so ridiculous, but because I want to ask him to try to assure himself that the divergences between reality and the theoretical model, which he is using to explain it, are no more than divergences which he is entitled to overlook. I am quite prepared to believe that there are cases where he is entitled to overlook them. But the issue is one which needs to be faced in each case.

When one turns to questions of policy, looking toward the future instead of the past, the use of equilibrium methods is still more suspect. For one cannot prescribe policy without considering at least the possibility that policy may be changed. There can be no change of policy if everything is to go on as expected – if the economy is to remain in what (however approximately) may be regarded as its existing equilibrium. It may be hoped that, after the change in policy, the economy will somehow, at some time in the future, settle into what may be regarded, in the same sense, as a new equilibrium; but there must necessarily be a stage before that equilibrium is reached …

I have paid no attention, in this article, to another weakness of IS-LM analysis, of which I am fully aware; for it is a weakness which it shares with General Theory itself. It is well known that in later developments of Keynesian theory, the long-term rate of interest (which does figure, excessively, in Keynes’ own presentation and is presumably represented by the r of the diagram) has been taken down a peg from the position it appeared to occupy in Keynes. We now know that it is not enough to think of the rate of interest as the single link between the financial and industrial sectors of the economy; for that really implies that a borrower can borrow as much as he likes at the rate of interest charged, no attention being paid to the security offered. As soon as one attends to questions of security, and to the financial intermediation that arises out of them, it becomes apparent that the dichotomy between the two curves of the IS-LM diagram must not be pressed too hard.

Back in 1937 John Hicks said that he was building a model of John Maynard Keynes’ General Theory. He wasn’t.

What Hicks acknowledges in 1980 is basically that his original review totally ignored the very core of Keynes' theory – uncertainty. In doing so he actually put the train of macroeconomics on the wrong track for decades. It's about time that neoclassical economists – Krugman, Mankiw, or whoever – set the record straight and stopped promoting something that its creator himself admitted was a total failure. Why not study the real thing – the General Theory – in full, and without looking the other way when it comes to non-ergodicity and uncertainty?


It is time to scrap the stabilization-policy framework!

31 March, 2012 at 10:47 | Posted in Economics | 5 Comments

How should economic policy best be conducted in an "extreme situation" like the one we are in today?

Many economists think we should stick to the prevailing stabilization-policy framework.

But not even the most extreme rules-based policy can do without a considerable degree of flexibility. Monetary and fiscal policy have to be conducted differently when we find ourselves in a situation close to the zero lower bound and liquidity traps make it hard to pursue the necessary expansionary monetary policy. In such a situation it is usually optimal to deviate from stated inflation targets and to apply escape clauses that give central banks enough flexibility to depart from the rule-governed low-inflation policy.

As several researchers have been able to show (for example here), the multiplier effects may very well be greater than 1 in such liquidity traps. To then – as in Sweden – refrain from pursuing an expansionary fiscal policy and mostly settle for tax cuts is usually counterproductive, since the effects are largely limited to raising the real interest rate and increasing private saving. In liquidity traps, public spending simply and as a rule gives a bigger "bang for the buck" than tax cuts.

That there would be no reason to abandon the normal fiscal-policy framework in more normal times is cold comfort, now that liquidity traps and debt deflation have turned out to be recurring features of our economy.

We live in economically troubled and turbulent times. Half-measures and faint-hearted proposals that change the stabilization-policy framework only at the margin will not do – because the prevailing framework in all likelihood helps make adequate crisis policy impossible in an "extreme situation" like the one we are living through today.

The Riksbank's inflation target of 2% is regarded by the establishment economists in this country as all but holy and heaven-sent. Other economists, myself included, are more doubtful. Paul Krugman and the IMF – rightly – do not share this almost religious insistence on a target of exactly 2%:

But why is the inflation target only 2 percent?

Actually, I understand why; the inflation hawks are still a powerful force that must be appeased. But the truth is that recent experience has made an overwhelming case for the proposition that the 2 percent or so implicit target prior to the Great Recession was too low, that 4 or 5 percent would be much better. Even the chief economist at the IMF says so …

The thing is, if we’re going to lock in a formal inflation target, now would be a good time to get it right, instead of waiting until the memory of the crisis fades and everyone gets complacent again.

Oxford professor Simon Wren-Lewis has a post on his blog today that is well worth reading and that further reinforces the impression that it is high time for the economics profession to shake off its almost religious fixation on what are at bottom rather arbitrary economic-policy targets:

Good policy takes account of risks, and what you can do about them. Being at the zero lower bound means that you do not do things that deflate demand unless you believe growth will be strong anyway. If that is what the government believed in 2010 they were foolish indeed …
As I continue to be surprised at the number of very good and sensible economists who seem reluctant to acknowledge that fiscal policy matters for demand when monetary policy is constrained, I fear they might have been led astray in part by (selective) advice received …
However, I am still not sure. As the recession hit, Osborne consistently argued against fiscal stimulus. In April 2009, George Osborne [shadow chancellor 2005-2010, since then Chancellor of the Exchequer in Cameron's coalition government] gave a speech which included a short history of macroeconomic thought, even including references to the Lucas critique. It ended up with New Keynesian models, and he then said this:
“[New Keynesian] Models of this kind underpin our whole macroeconomic policy framework – in particular the idea that by using monetary policy to manage demand and control inflation you can keep unemployment low and stable. And they underpinned the argument David Cameron and I advanced last autumn – that monetary policy should bear the strain of stimulating demand – an argument echoed by the Governor of the Bank of England last month when he said that “monetary policy should bear the brunt of dealing with the ups and downs of the economy”. We now appear to be winning that argument hands down.”
The previous month, the Bank of England reduced interest rates to 0.5%, where they have remained ever since. So a month after interest rates hit the zero lower bound, Osborne gives a speech which included a perfectly sensible account of macroeconomic policy, except when you hit a zero lower bound. So perhaps they were after all ‘very foolish indeed’.

Conducting economic policy when the sun is shining and the weather is fair is one thing. Conducting the same kind of rules-based policy when the weather reports warn of squalls and dark clouds is indeed "very foolish indeed".

Uncertainty and ergodicity – the important difference between Keynes and Knight

30 March, 2012 at 14:34 | Posted in Economics, Statistics & Econometrics, Theory of Science & Methodology | 8 Comments

This week I've had an interesting discussion with Paul Davidson – founder and editor of the Journal of Post Keynesian Economics – on uncertainty and ergodicity, over at the Real-World Economics Review Blog. It all started with me commenting on Davidson's article Is economics a science? Should economics be rigorous?:

LPS:

Davidson's article is a nice piece – but ergodicity is a difficult concept that many students of economics have problems understanding. To understand real-world "non-routine" decisions and unforeseeable changes in behaviour, ergodic probability distributions are of no avail. In a world full of genuine uncertainty – where real historical time rules the roost – the probabilities that ruled the past are not those that will rule the future.

Time is what prevents everything from happening at once. To simply assume that economic processes are ergodic – and, a fortiori, in any relevant sense timeless – and to concentrate on ensemble averages is not a sensible way of dealing with the kind of genuine uncertainty that permeates open systems such as economies.

When you assume that economic processes are ergodic, ensemble and time averages are identical. Let me give an example: Assume we have a market with an asset priced at 100 €. Then imagine the price first goes up by 50% and then later falls by 50%. The ensemble average for this asset would be 100 € – because we here envision two parallel universes (markets), where the asset price falls by 50% to 50 € in one universe (market) and rises by 50% to 150 € in the other, giving an average of 100 € ((150+50)/2). The time average for this asset would be 75 € – because we here envision one universe (market) where the asset price first rises by 50% to 150 €, and then falls by 50% to 75 € (0.5*150).

From the ensemble perspective nothing really, on average, happens. From the time perspective lots of things really, on average, happen.

Assuming ergodicity there would have been no difference at all.
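
Just to make the arithmetic of the example explicit, here is a minimal sketch in Python (it adds nothing beyond the numbers already given above):

```python
# Ensemble average vs. time average for the 100 € asset example above.
initial_price = 100.0
up, down = 1.5, 0.5  # a +50% move and a -50% move

# Ensemble perspective: two parallel markets, one up-move and one down-move.
ensemble_average = (initial_price * up + initial_price * down) / 2
print(ensemble_average)  # 100.0 -> "on average nothing happens"

# Time perspective: one market, the up-move followed by the down-move.
time_path_final = initial_price * up * down
print(time_path_final)   # 75.0 -> the investor ends up 25% poorer
```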

Just in case you think this is just an academic quibble without repercussion to our real lives, let me quote from an article of physicist and mathematician Ole Peters in the Santa Fe Institute Bulletin from 2009 – “On Time and Risk” – that makes it perfectly clear that the flaw in thinking about uncertainty in terms of “rational expectations” and ensemble averages has had real repercussions on the functioning of the financial system:

“In an investment context, the difference between ensemble averages and time averages is often small. It becomes important, however, when risks increase, when correlation hinders diversification, when leverage pumps up fluctuations, when money is made cheap, when capital requirements are relaxed. If reward structures—such as bonuses that reward gains but don’t punish losses, and also certain commission schemes—provide incentives for excessive risk, problems arise. This is especially true if the only limits to risk-taking derive from utility functions that express risk preference, instead of the objective argument of time irreversibility. In other words, using the ensemble average without sufficiently restrictive utility functions will lead to excessive risk-taking and eventual collapse. Sound familiar?”
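
Peters' point is easy to check numerically. Here is a small simulation sketch of my own (the repeated ±50% gamble and all numbers are purely illustrative assumptions, not Peters' own example): repeat the bet many times and the ensemble average stays flat, while the typical individual trajectory is wiped out – precisely the wedge between ensemble and time perspectives that opens up when "leverage pumps up fluctuations":

```python
# Illustrative simulation: a 50/50 bet that adds 50% or removes 50% of wealth.
import random

random.seed(1)

def one_trajectory(rounds, start=100.0):
    """Wealth after repeatedly staking everything on the 50/50 gamble."""
    wealth = start
    for _ in range(rounds):
        wealth *= 1.5 if random.random() < 0.5 else 0.5
    return wealth

outcomes = [one_trajectory(20) for _ in range(10_000)]

ensemble_mean = sum(outcomes) / len(outcomes)
median_outcome = sorted(outcomes)[len(outcomes) // 2]

# The ensemble mean is 100 in expectation (each round multiplies wealth by 1.0
# on average), but the median trajectory ends near 100 * 0.75**10, i.e. about
# 5.6 -> the gamble ruins the typical player even though it is fair "on average".
print(round(ensemble_mean, 1), round(median_outcome, 1))
```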

PD:

Lars, if the stochastic process is ergodic, then for an infinite number of realizations the time and space (ensemble) averages will coincide. An ensemble is samples drawn at a fixed point of time from a universe of realizations. For finite realizations, the time and space statistical averages tend to converge (with a probability of one) the more data one has.

Even in physics there are some processes that physicists recognize are governed by nonergodic stochastic processes [see A. M. Yaglom, An Introduction to Stationary Random Functions (1962, Prentice-Hall)].

I do object to Ole Peters' exposition where he talks about "when risks increase". Nonergodic systems are not about increasing or decreasing risk in the sense of the probability distribution variances differing. The point is that any probability distribution based on past data cannot be reliably used to indicate the probability distribution governing any future outcome. In other words, even if (we could know that) the future probability distribution will have a smaller variance ("lower risks") than the past calculated probability distribution, the past distribution is not a reliable guide to future statistical means and other moments around the means.
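
To illustrate Davidson's point with a toy example of my own (the numbers and the structural break are pure assumptions): in a nonstationary setting, collecting ever more past data only pins down the old distribution more precisely – it says nothing about the distribution that will govern future outcomes:

```python
# A toy nonstationary series: the data-generating process simply changes.
import random
import statistics

random.seed(42)

past = [random.gauss(2.0, 1.0) for _ in range(5_000)]     # the world as it was
future = [random.gauss(-1.0, 3.0) for _ in range(5_000)]  # after a structural break

print("past mean/stdev  :", round(statistics.mean(past), 2), round(statistics.stdev(past), 2))
print("future mean/stdev:", round(statistics.mean(future), 2), round(statistics.stdev(future), 2))
# More past observations would only sharpen the estimate of the *old* moments;
# they could never reveal the break that governs the future segment.
```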

LPS:

Paul, re nonergodic processes in physics I would even say that MOST processes definitely are nonergodic. Re Ole Peters, I totally agree that the important thing about real social and economic processes being nonergodic is that uncertainty – not risk – rules the roost. That was something both Keynes and Knight basically said in their 1921 books. But I still think that Peters' discussion is a good example of how thinking about uncertainty in terms of "rational expectations" and "ensemble averages" has had seriously bad repercussions on the financial system.

PD:

Lars, there is a difference between the uncertainty concept developed by Keynes and the one developed by Knight.

As I have pointed out, Keynes's concept of uncertainty involves a nonergodic stochastic process. On the other hand, Knight's uncertainty – like Taleb's black swan – assumes an ergodic process. The difference is that for Knight (and Taleb) the uncertain outcome lies so far out in the tail of the unchanging (over time) probability distribution that it appears empirically to be [in Knight's terminology] "unique". In other words, like Taleb's black swan, the uncertain outcome already exists in the probability distribution but is so rarely observed that it may take several lifetimes for one observation – making that observation "unique".

In the latest edition of Taleb's book, he was forced to concede that philosophically there is a difference between a nonergodic system and a black swan ergodic system – but he then waves away the problem with the claim that the difference is irrelevant.

LPS:

Paul, on the whole, I think you're absolutely right on this. Knight's uncertainty concept has an epistemological foundation and Keynes's definitely an ontological one. Of course this also has repercussions on the issue of ergodicity in a strict methodological and mathematical-statistical sense. I think Keynes's view is the more warranted of the two.

BUT – from a “practical” point of view I have to agree with Taleb. Because if there is no reliable information on the future, whether you talk of epistemological or ontological uncertainty, you can’t calculate probabilities.

The most interesting and far-reaching difference between the epistemological and the ontological view is that if you subscribe to the former, Knightian view – as Taleb and "black swan" theorists basically do – you open the door to the mistaken belief that with better information and greater computing power we should somehow always be able to calculate probabilities and describe the world as an ergodic universe. As both you and Keynes have convincingly argued, that is ontologically just not possible.

PD:

Lars, your last sentence says it all. If you believe it is an ergodic system and that epistemology is the only problem, then you should urge more transparency, better data collection, hiring more "quants" on Wall Street to generate "better" risk-management computer programs, etc. – and above all keep the government out of regulating financial markets – since all the government can do is foul up the outcome that the ergodic process is ready to deliver.

Long live Stiglitz and the call for transparency to end asymmetric information — and permit all to know the epistemological solution for the ergodic process controlling the economy.

Or as Milton Friedman would say, those who make decisions "as if" they knew the ergodic stochastic process create an optimum market solution – while those who make mistakes in trying to figure out the ergodic process are like the dinosaurs, doomed to fail and die off – leaving only the survival of the fittest for a free market economy to prosper on. The proof, supposedly, is that all those 1% fat-cat CEO managers in the banking business receive such large salaries for their "correct" decisions involving financial assets.

Alternatively, if the financial and economic system is nonergodic, then there is a positive role for government to regulate what decision makers can do, so as to prevent them from mass destruction of themselves and other innocent bystanders – and also for government to take positive action when the herd behavior of decision makers is causing the economy to run off the cliff.

So this distinction between ergodic and nonergodic is essential if we are to build institutional structures that make running off the cliff almost impossible – and if government is to be ready to take action when some innovative fool(s) discovers a way around the institutional barriers and starts to run the economy off the cliff.

To Keynes the source of uncertainty was in the nature of the real – nonergodic – world. It had to do, not only – or primarily – with the epistemological fact of us not knowing the things that today are unknown, but rather with the much deeper and far-reaching ontological fact that there often is no firm basis on which we can form quantifiable probabilities and expectations.

Has “modern” macroeconomics delivered?

29 March, 2012 at 14:37 | Posted in Economics | Comments Off on Has “modern” macroeconomics delivered?

Jonathan Schlefer, research associate at Harvard Business School, has written a new book – The Assumptions Economists Make – on what “modern” macroeconomics has delivered during the last few decades. Although Schlefer shares Milton Friedman’s instrumentalist view on theories and models, he can’t really see that they have delivered what they promised – useful predictions. Justin Fox summarizes:

By that standard, here are Schlefer’s judgments on the succession of theories that have dominated academic macroeconomics since the 1970s:

Rational expectations (which proposed that we’re all too smart to be fooled by money-printing central bankers and deficit-spending governments): Intellectually interesting, and maybe helpful in “normal times,” whatever those are. But not very good at describing or predicting the actual behavior of the economy at any time, and worthless in a crisis.

Real business-cycle theory (which says that economic ups and downs are all caused by technology-induced changes in productivity): “[N]ot only are these models a tautology — they are a tautology that turns out to be wrong. They say that employment rises or falls because actors choose to work more when productivity is high and less when it’s low. This is nuts.”

DSGE (sometimes called “New Keynesian”) models: Not “quite as bad as they sound,” as they do describe an economy that moves along by fits and starts. They just don’t leave room for any crazy stuff.

Keen answers Krugman

29 March, 2012 at 11:42 | Posted in Economics | 4 Comments

Steve Keen has answered Krugman.
To the point.
Brief.
Read it!

Neoclassical economics – visionless and escaping responsibility

28 March, 2012 at 23:29 | Posted in Economics, Theory of Science & Methodology | Comments Off on Neoclassical economics – visionless and escaping responsibility

A while ago Roger Backhouse and Bradley Bateman had a nice piece in the New York Times on the lack of perspectives and alternatives shown by mainstream economists when dealing with the systemic crises of modern economies:

Economists do much better when they tackle small, well-defined problems. As John Maynard Keynes put it, economists should become more like dentists: modest people who look at a small part of the body but remove a lot of pain.

However, there are also downsides to approaching economics as a dentist would: above all, the loss of any vision about what the economic system should look like. Even Keynes himself was driven by a powerful vision of capitalism. He believed it was the only system that could create prosperity, but it was also inherently unstable and so in need of constant reform. This vision caught the imagination of a generation that had experienced the Great Depression and World War II and helped drive policy for nearly half a century …

In the 20th century, the main challenge to Keynes’s vision came from economists like Friedrich Hayek and Milton Friedman, who envisioned an ideal economy involving isolated individuals bargaining with one another in free markets. Government, they contended, usually messes things up. Overtaking a Keynesianism that many found inadequate to the task of tackling the stagflation of the 1970s, this vision fueled neoliberal and free-market conservative agendas of governments around the world.

THAT vision has in turn been undermined by the current crisis. It took extensive government action to prevent another Great Depression, while the enormous rewards received by bankers at the heart of the meltdown have led many to ask whether unfettered capitalism produced an equitable distribution of wealth. We clearly need a new, alternative vision of capitalism. But thanks to decades of academic training in the “dentistry” approach to economics, today’s Keynes or Friedman is nowhere to be found.

And now Philip Mirowski has an equally interesting article on opendemocracy.net explaining why neoclassical economists

don’t seem to have suffered one whit for the subsequent sequence of events, a slow-motion train wreck that one might reasonably have expected would have rubbished the credibility of lesser mortals.

Read it!

The Centre Party and teachers' salaries

27 March, 2012 at 10:41 | Posted in Economics, Education & School | Comments Off on Centerpartiet och lärarlönerna

The neoliberal Centre Party – led by Annie Lööf, a devout admirer of Margaret Thatcher and Ayn Rand – has proposed that entry-level wages be lowered in order to create jobs. As I have written before, this is a completely crazy idea, with little or no grounding in economic good sense.

But one may also wonder how things stand with the party's ideological purity. The Centre Party also claims that the state should play an active role in the wage-formation process. That does not sound particularly liberal or neoliberal. And why does the party not take the same stance when it comes to schools and teachers' salaries? Would it not have been consistent to also demand state control of teachers' salaries? Earmarking of central-government grants to the municipalities (so that the money is made conditional on raising teachers' pay)? Renationalizing the schools? I'm just asking.

Lower wages solve nothing

24 March, 2012 at 14:53 | Posted in Economics | 14 Comments

With every passing month the downturn in Sweden gathers strength. The coming years threaten to bring a protracted slump in the Swedish economy.

In this, to say the least, difficult situation – with the economy once again on the ropes – we have in recent weeks been able to read how both the government and the employers propose trying to solve the crisis with wage cuts in one form or another (youth wages in particular have been on the table).

This should really be seen, above all, as a sign of how low confidence in the economic system has sunk. From yesterday's neoliberal wet dreams about the commanding heights of the economy we have descended into the economic reality of bank crashes, business closures and galloping unemployment. For it is obviously not the case that wage cuts save jobs. In a situation where the crisis that began around 2007-08 is far from over, either globally or here at home, what we need more than anything else are stimulus measures and an economic policy that raises effective demand.

At the level of society as a whole, wage cuts only increase the risk that even more people will end up without jobs. Believing that crises can be solved this way is a relapse into the misguided economic theory and policy that John Maynard Keynes settled accounts with once and for all back in the 1930s. It was a policy that made millions of people around the world unemployed. The depression of the 1930s was accompanied by deflation, which admittedly could mean higher real wages – but only for those who managed to keep their jobs.

Of course, freezing or cutting wages may work in the short run for individual firms and unions. But it is an atomistic fallacy to believe that a general policy of wage cuts would benefit the economy as a whole. On the contrary. As Keynes showed, the aggregate effect of wage cuts is catastrophic. They set off a cumulative spiral of falling prices that makes the debts of firms and households grow in real terms, since nominal debts are unaffected by the general movement of prices and wages. In an economy that has come to rest ever more on borrowing and indebtedness, this is the gateway to a deflationary crisis. That in turn means that no one wants to borrow money and capital, since the burden of repayment becomes too heavy over time. Firms become insolvent, investment falls, unemployment rises and depression is at the door.

Let us speak. And speak plainly. The risk of debt deflation can never be trivialized. Looking at Swedish data on changes in inflation and GDP for the years 1980-2008, we find that for every point by which real GDP falls short of potential GDP, the inflation rate tends to fall by half a point. This has to be taken seriously. If the gap between real and potential GDP widens over the next few years, deflation may soon become a fact.

The overriding problem for the Swedish economy is that consumption and credit are not picking up. Confidence and effective demand have to be restored. We obviously cannot just sit with our arms crossed and wait for the storm to pass – but proposing a crisis remedy built on wage cuts is writing a prescription for even worse disasters. If the government and the employers want to take responsibility for society – and not let simple-mindedness replace analytical ability and good judgment – they must clearly and unambiguously reject all the short-sighted and counterproductive wage-cutting strategies on offer. Getting out of deep economic slumps requires sharper, better and more effective instruments – such as an active and aggressive fiscal policy.

Austerity in depressed economies

23 March, 2012 at 11:44 | Posted in Economics | Comments Off on Austerity in depressed economies

Paul Krugman has a nice piece today on his blog, commenting on the recently published paper by Brad DeLong and Larry Summers on fiscal policy in a depressed economy.
Even though the long-run effects of austerity policies may be minimal, they certainly impose large costs here and now. Self-evident conclusion: stop cutting and start stimulating, at least as long as we are still trapped in a liquidity trap.

I’ve been posting various versions of a scatterplot showing the relationship between one indicator of fiscal policy and growth since the crisis began. Here’s a version restricted to eurozone countries and countries maintaining a fixed exchange rate against the euro, with many of the countries labeled:

(All data from Eurostat).

Still, is this the kind of outcome you would have expected if you believed what the Austerians were saying? Or is it what you would have expected if you’d been reading those of us horrified by the turn to austerity?

What gets me about all of this is the incredible, unwarranted arrogance of the austerians. They decided that they knew better than textbook macroeconomics, even though none of them had predicted the crisis or even seen the possibility of such a crisis.

And the wreckage now lies all around us.

New-Keynesian macroeconomics and involuntary unemployment

21 March, 2012 at 16:25 | Posted in Economics | 7 Comments

People calling themselves "new-Keynesians" – a gross misnomer – ought to be rather embarrassed by the fact that the kind of microfounded dynamic stochastic general equilibrium models they use cannot incorporate such a basic fact of reality as involuntary unemployment!

Of course, working with representative agent models, this should come as no surprise. If one representative agent is employed, all representative agents are. The kind of unemployment that occurs is voluntary, since it only reflects these optimizing agents adjusting their hours of work so as to maximize their utility.

For a "new-Keynesian", it ought to be of interest to know what Keynes had to say on the issue. In the General Theory (1936, chapter 2), he writes:

The classical school [maintains that] while the demand for labour at the existing money-wage may be satisfied before everyone willing to work at this wage is employed, this situation is due to an open or tacit agreement amongst workers not to work for less, and that if labour as a whole would agree to a reduction of money-wages more employment would be forthcoming. If this is the case, such unemployment, though apparently involuntary, is not strictly so, and ought to be included under the above category of ‘voluntary’ unemployment due to the effects of collective bargaining, etc …

The classical theory … is best regarded as a theory of distribution in conditions of full employment. So long as the classical postulates hold good, unemployment, which is in the above sense involuntary, cannot occur. Apparent unemployment must, therefore, be the result either of temporary loss of work of the ‘between jobs’ type or of intermittent demand for highly specialised resources or of the effect of a trade union ‘closed shop’ on the employment of free labour. Thus writers in the classical tradition, overlooking the special assumption underlying their theory, have been driven inevitably to the conclusion, perfectly logical on their assumption, that apparent unemployment (apart from the admitted exceptions) must be due at bottom to a refusal by the unemployed factors to accept a reward which corresponds to their marginal productivity …

Obviously, however, if the classical theory is only applicable to the case of full employment, it is fallacious to apply it to the problems of involuntary unemployment – if there be such a thing (and who will deny it?). The classical theorists resemble Euclidean geometers in a non-Euclidean world who, discovering that in experience straight lines apparently parallel often meet, rebuke the lines for not keeping straight – as the only remedy for the unfortunate collisions which are occurring. Yet, in truth, there is no remedy except to throw over the axiom of parallels and to work out a non-Euclidean geometry. Something similar is required to-day in economics. We need to throw over the second postulate of the classical doctrine and to work out the behaviour of a system in which involuntary unemployment in the strict sense is possible.

The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging de facto is made, macroeconomic model-building is little more than "hand waving" that gives us rather little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

 

Economics and Reality

19 March, 2012 at 20:23 | Posted in Economics, Theory of Science & Methodology | Comments Off on Economics and Reality

“Modern” economics has become increasingly irrelevant to the understanding of the real world. In his seminal book Economics and Reality (1997) Tony Lawson traced this irrelevance to the failure of economists to match their deductive-axiomatic methods with their subject.
It is – sad to say – as relevant today as it was fifteen years ago.
It is still a fact that within mainstream economics internal validity is everything and external validity nothing. Why anyone should be interested in that kind of theory and model is beyond my imagination. As long as mainstream economists do not come up with export licenses for their theories and models to the real world in which we live, they really should not be surprised if people say that this is not science, but autism!
Studying mathematics and logic is interesting and fun. It sharpens the mind. In pure mathematics and logic we do not have to worry about external validity. But economics is not pure mathematics or logic. It's about society. The real world. Forgetting that, economics is really in dire straits.

Economics and Reality was a great inspiration to me fifteen years ago. It still is.

Danish lesson (II)

18 March, 2012 at 19:44 | Posted in Varia | Comments Off on Dansklektion (II)

“Modern” macroeconomics and uncertainty

17 March, 2012 at 16:26 | Posted in Economics, Statistics & Econometrics, Theory of Science & Methodology | 1 Comment

The financial crisis of 2007-08 took most laymen and economists by surprise. What was it that went wrong with our macroeconomic models, since they obviously did not foresee the collapse or even make it conceivable?

There are many who have ventured to answer this question. And they have come up with a variety of answers, ranging from the exaggerated mathematization of economics, to irrational and corrupt politicians. 

But the root of our problem goes much deeper. It ultimately goes back to how we look upon the data we are handling. In "modern" macroeconomics – dynamic stochastic general equilibrium, new synthesis, new-classical and new-Keynesian – variables are treated as if drawn from a known "data-generating process" that unfolds over time and on which we therefore have access to heaps of historical time-series. If we do not assume that we know the "data-generating process" – if we do not have the "true" model – the whole edifice collapses. And of course it has to. I mean, who really honestly believes that we should have access to this mythical Holy Grail, the data-generating process?

“Modern” macroeconomics obviously did not anticipate the enormity of the problems that unregulated “efficient” financial markets created. Why? Because it builds on the myth of us knowing the “data-generating process” and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

This is like saying that you are going on a holiday trip and that you know that the chance of the weather being sunny is at least 30%, and that this is enough for you to decide whether or not to bring your sunglasses. You are supposed to be able to calculate the expected utility based on the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.

But as Keynes convincingly argued in his monumental Treatise on Probability (1921), this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10%, and according to another – equally good – model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to.
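
To make the contrast concrete, here is a minimal sketch with made-up utility numbers of my own. Under risk, a single known probability yields a unique "rational" choice; under Keynesian uncertainty, two equally good models point in opposite directions and there is no single expected utility to maximize:

```python
# Risk vs. uncertainty in the sunglasses example (utility numbers are illustrative only).

def eu_bring(p_sun):
    """Expected utility of bringing the sunglasses, relative to leaving them at
    home (utility 0): +10 if it turns out sunny, -4 for lugging them around in rain."""
    return p_sun * 10 + (1 - p_sun) * (-4)

# Risk: one agreed-upon probability -> a unique decision.
print(eu_bring(0.30))                  # 0.2 > 0, so bring them

# Keynesian uncertainty: two equally good models disagree about the probability,
# and nothing tells us which one to use -- the "rational" decision is undetermined.
print(eu_bring(0.10), eu_bring(0.40))  # -2.6 vs 1.6: the models point in opposite directions
```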

In the end this is what it all boils down to. We all know that many activities, relations, processes and events are of the Keynesian uncertainty-type. The data do not unequivocally single out one decision as the only “rational” one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works. 

Some macroeconomists, however, still want to be able to use their hammer. So they decide to pretend that the world looks like a nail, and pretend that uncertainty can be reduced to risk. So they construct their mathematical models on that assumption. The result: financial crises and economic havoc.

How much better – and how much smaller the risk of lulling ourselves into the comforting thought that we know everything, that everything is measurable and that we have everything under control – if instead we could simply admit that we often just do not know, and that we have to live with that uncertainty as best we can.

Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheel is a sure recipe for only one thing – economic catastrophe!

The state of microfoundations and macroeconomics – Robert Solow says it all

16 March, 2012 at 15:15 | Posted in Economics, Theory of Science & Methodology | 3 Comments

The purported strength of new-classical and new-Keynesian macroeconomics is that they have a firm anchorage in preference-based microeconomics, and especially in the decisions taken by inter-temporal utility-maximizing "forward-looking" individuals.

To some of us, however, this has come at too high a price. The almost quasi-religious insistence that macroeconomics has to have microfoundations – without ever presenting either ontological or epistemological justifications for the claim – has turned a blind eye to the weakness of the whole enterprise of trying to depict a complex economy based on an all-embracing representative actor equipped with superhuman knowledge, forecasting abilities and forward-looking rational expectations. It is as if – after having swallowed the sour grapes of the Sonnenschein-Mantel-Debreu theorem – these economists want to resurrect the omniscient Walrasian auctioneer in the form of all-knowing representative actors equipped with rational expectations and assumed to somehow know the true structure of our model of the world (how that could even be conceivable is beyond my imagination, given that the ongoing debate on microfoundations, if anything, shows that not even we, the economists, can agree on a common model).

Following the greatest economic depression since the 1930s, the grand old man of modern economic growth theory, Nobel laureate Robert Solow, gave a prepared statement on "Building a Science of Economics for the Real World" for a hearing in the U.S. Congress on July 20, 2010. According to Solow, modern macroeconomics has not only failed to solve present economic and financial problems, but is "bound" to fail. Building dynamic stochastic general equilibrium (DSGE) models on "assuming the economy populated by a representative agent" – consisting of "one single combination worker-owner-consumer-everything-else who plans ahead carefully and lives forever" – does not pass "the smell test: does this really make sense?" One cannot but concur in Solow's surmise that a thoughtful person "faced with the thought that economic policy was being pursued on this basis, might reasonably wonder what planet he or she is on."

Already in 2008 Solow had – in "The State of Macroeconomics" (Journal of Economic Perspectives 2008:243-249) – told us what he thought of microfounded "modern macro":

[When modern macroeconomists] speak of macroeconomics as being firmly grounded in economic theory, we know what they mean … They mean a macroeconomics that is deduced from a model in which a single immortal consumer-worker-owner maximizes a perfectly conventional time-additive utility function over an infinite horizon, under perfect foresight or rational expectations, and in an institutional and technological environment that favors universal price-taking behavior …

No one would be driven to accept this story because of its obvious “rightness”. After all, a modern economy is populated by consumers, workers, pensioners, owners, managers, investors, entrepreneurs, bankers, and others, with different and sometimes conflicting desires, information, expectations, capacities, beliefs, and rules of behavior … To ignore all this in principle does not seem to qualify as mere abstraction – that is setting aside inessential details. It seems more like the arbitrary suppression of clues merely because they are inconvenient for cherished preconceptions …

Friends have reminded me that much effort of ‘modern macro’ goes into the incorporation of important deviations from the Panglossian assumptions … [But] a story loses legitimacy and credibility when it is spliced to a simple, extreme, and on the face of it, irrelevant special case. This is the core of my objection: adding some realistic frictions does not make it any more plausible than an observed economy is acting out the desires of a single, consistent, forward-looking intelligence …

It seems to me, therefore, that the claim that ‘modern macro’ somehow has the special virtue of following the principles of economic theory is tendentious and misleading … The other possible defense of modern macro is that, however special it may seem, it is justified empirically. This strikes me as a delusion …

So I am left with a puzzle, or even a challenge. What accounts for the ability of ‘modern macro’ to win hearts and minds among bright and enterprising academic economists? … There has always been a purist streak in economics that wants everything to follow neatly from greed, rationality, and equilibrium, with no ifs, ands, or buts … The theory is neat, learnable, not terribly difficult, but just technical enough to feel like ‘science’. Moreover it is practically guaranteed to give laissez-faire-type advice, which happens to fit nicely with the general turn to the political right that began in the 1970s and may or may not be coming to an end.

Earlier this week, new-Keynesian macroeconomist Simon Wren-Lewis asked me to explain why other ways of doing macro have (purportedly) died out and the microfoundations approach has become so dominant. In my admittedly tentative answer I wrote:

(1) One could of course say that one reason why the microfoundations approach is so dominant is – as Krugman has it on his blog today – that "trying to embed your ideas in a microfounded model can be a very useful exercise — not because the microfounded model is right, or even better than an ad hoc model, but because it forces you to think harder about your assumptions, and sometimes leads to clearer thinking". But I don't really believe that is an especially important reason on the whole. I mean, if people put the enormous amount of time and energy that they do into constructing macroeconomic models, then those models really have to contribute substantially to our understanding and ability to explain and grasp real macroeconomic processes. If not, they should – after perhaps having helped sharpen our thoughts – be thrown into the waste-paper basket (something the father of macroeconomics, Keynes, used to do), and not, as today, be allowed to overrun our economics journals and confer lots of academic prestige on their authors.

(2) A more plausible reason is that microfoundations is in line with the reductionism inherent in the methodological individualism that almost all neoclassical economists subscribe to. And as argued by e.g. Johan Åkerman and Ekkehart Schlicht, this is deeply problematic for a macroeconomics trying to solve the "summation problem" without nullifying the possibility of emergence.

(3) It is thought to give macroeconomists the means to fully predetermine their models and come up with definitive, robust, stable answers. In reality we know that the forecasts and expectations of individuals often differ systematically from what materializes in the aggregate, since knowledge is imperfect and uncertainty – rather than risk – rules the roost.

(4) Microfoundations allegedly gets around the Lucas critique by focusing on "deep" structural, invariant parameters of optimizing individuals' preferences and tastes. As I have argued, this is an empty hope without solid empirical or methodological foundation.

The kind of microfoundations that "new-Keynesian" and new-classical general equilibrium macroeconomists base their models on is not – at least from a realist point of view – plausible.

As all students of economics know, time is limited. Given that, there have to be better ways to use it than spending hours and hours working through or constructing irrelevant macroeconomic models founded on microfoundations chosen more for mathematical tractability than for applicability to reality. I would rather recommend that my students allocate their time to constructing better, real and relevant macroeconomic models – models that really help us to explain and understand reality.

Of course, I could just as well have directed Wren-Lewis to Robert Solow’s article. There the answer to his question was given already four years ago.

Ask the mountains

15 March, 2012 at 19:24 | Posted in Varia | Comments Off on Ask the mountains

The Gini coefficient – a clarification

14 March, 2012 at 18:11 | Posted in Statistics & Econometrics | Comments Off on Ginin – ett klargörande

Some students have been in touch to point out that Daniel Waldenström at ekonomistas today also has a chart of the development of the Gini coefficient in Sweden – but that his chart differs from the one I posted earlier today in the article Inequality in Sweden continues to increase.

The reason for the discrepancy, however, is not that either of us has miscalculated, but simply that Statistics Sweden (SCB), in order to make possible the long time series I have used, there relies on an older household concept and an older definition of disposable income than in the shorter time series that Daniel builds on. C'est tout!
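
For readers who want to see what is actually being computed, here is a minimal Gini-coefficient sketch with toy numbers of my own (not SCB data). The point is simply that the same households can yield different Gini values depending on which income concept and household definition are used:

```python
def gini(incomes):
    """Gini coefficient via the mean-absolute-difference formula."""
    xs = sorted(incomes)
    n = len(xs)
    mean = sum(xs) / n
    mad = sum(abs(a - b) for a in xs for b in xs) / (n * n)  # mean absolute difference
    return mad / (2 * mean)

older_definition = [120, 180, 240, 300, 900]   # e.g. an older disposable-income concept
newer_definition = [100, 200, 260, 340, 1100]  # e.g. a revised concept, same households

print(round(gini(older_definition), 3))  # about 0.39
print(round(gini(newer_definition), 3))  # about 0.43
```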

Krugman only gets it partly right in the microfoundations debate

14 March, 2012 at 15:02 | Posted in Economics, Theory of Science & Methodology | 1 Comment

Paul Krugman has another piece on microfoundations – the absolutely fundamental issue that I've been discussing with e.g. Simon Wren-Lewis during the last couple of weeks – on his blog today. Krugman writes (my italics):

I upset even New Keynesians, such as Simon Wren-Lewis, with my observation that the crusade for microfoundations has had only one success, the prediction of stagflation after an extended period of high inflation, and that this success is 35 years old.

Yet I stand by that statement.

Let me be clear about what I mean in saying that. I don’t mean that setting up and working out microfounded models is a waste of time. On the contrary, trying to embed your ideas in a microfounded model can be a very useful exercise — not because the microfounded model is right, or even better than an ad hoc model, but because it forces you to think harder about your assumptions, and sometimes leads to clearer thinking. In fact, I’ve had that experience several times: I was convinced that the liquidity trap was wrong until I did a miniature NK model and saw that it made sense, my work with Gauti Eggertsson on deleveraging also uses NK modeling to clarify matters.

But bear in mind what we've actually seen in academic economics: the development of an ethos in which only microfounded models are considered "real" theory, in which it's basically impossible to publish a paper unless it's intertemporal optimization all the way. That's the kind of dominance a theory is only entitled to if it produces dramatically better predictions than the theory it has crowded out: Light bends! The sea floor spreads! And that just hasn't happened; there has, I repeat, been only one significant predictive success of this kind from microfoundations, and that happened a very long time ago.

Although I certainly agree with Krugman’s harsh judgement on microfoundations of macroeconomics – and it goes for both the “new-Keynesian” and new-classical variety – I am however not convinced by the argument that “trying to embed your ideas in a microfounded model can be a very useful exercise — not because the microfounded model is right, or even better than an ad hoc model, but because it forces you to think harder about your assumptions, and sometimes leads to clearer thinking”.

Why? Because it smacks too much of l'art pour l'art.

I mean, if people put the enormous amount of time and energy that they do into constructing macroeconomic models, then those models really have to contribute substantially to our understanding and ability to explain and grasp real macroeconomic processes. If not, they should – after perhaps having helped sharpen our thoughts – be thrown into the waste-paper basket (something the father of macroeconomics, Keynes, used to do), and not, as today, be allowed to overrun our economics journals and confer celestial academic prestige on their authors.

Krugman's explications on this issue are really interesting, also because they shed light on a kind of inconsistency in his art of argumentation. Over the last couple of years Krugman has in more than one article criticized mainstream economics for using too much (bad) mathematics and axiomatics in its model-building endeavours. But when it comes to defending his own position on various issues, he usually ends up falling back on the same kind of models himself. This shows up in the quotation above as well, where he refers to the work he has done with Eggertsson – work that, when it comes to methodology and assumptions, actually has a lot in common with the kind of model-building he otherwise criticizes.

On most macroeconomic policy discussions I find myself in agreement with Krugman. To me that just shows that Krugman is right in spite of, and not thanks to, those models he ultimately refers to. When he discusses austerity measures, Ricardian equivalence or problems with the euro, he is actually not using those models, but rather simpler and more adequate and relevant thought-constructions in the vein of Keynes.

As all students of economics know, time is limited. Given that, there have to be better ways to use it than spending hours and hours working through or constructing irrelevant macroeconomic models just because the exercise "sometimes leads to clearer thinking." I would rather recommend that my students allocate their time to constructing better, real and relevant macroeconomic models – models that really help us to explain and understand reality.

Inequality in Sweden continues to increase

14 March, 2012 at 09:47 | Posted in Economics, Statistics & Econometrics | Comments Off on Inequality in Sweden continues to increase

New data from Statistics Sweden show that inequality continues to increase in Sweden:

[Chart: the development of the Gini coefficient in Sweden. Source: Statistics Sweden]
I would say that what we see happening in Sweden is deeply disturbing. The rising inequality probably has to do with income and wealth increasingly being concentrated in the hands of a very small and privileged elite.

Microfoundations – of course there is an alternative!

13 March, 2012 at 21:37 | Posted in Economics | Comments Off on Microfoundations – of course there is an alternative!

In the conclusion to his book Models of Business Cycles (1987), Robert Lucas (in)famously wrote (p. 66 & 107-08):

It is remarkable and, I think, instructive fact that in nearly 50 years that Keynesian tradition has produced not one useful model of the individual unemployed worker, and no rationale for unemployment insurance beyond the observation that, in common with countercyclical cash grants to corporations or to anyone else, it has the effects of increasing the total volume of spending at the right times.  By dogmatically insisting that unemployment be classed as ‘involuntary’ this tradition simply cut itself off from serious thinking about the actual options unemployed people are faced with, and hence from learning anything about how the alternative social arrangements might improve these options.

The most interesting recent developments in macroeconomic theory seem to me describable as the reincorporation of aggregative problems such as inflation and the business cycle within the general framework of 'microeconomic' theory. If these developments succeed, the term 'macroeconomics' will simply disappear from use and the modifier 'micro' will become superfluous. We will simply speak, as did Smith, Ricardo, Marshall and Walras, of economic theory. If we are honest, we will have to face the fact that at any given time there will be phenomena that are well-understood from the point of view of the economic theory we have, and other phenomena that are not. We will be tempted, I am sure, to relieve the discomfort induced by discrepancies between theory and facts by saying the ill-understood facts are the province of some other, different kind of economic theory. Keynesian 'macroeconomics' was, I think, a surrender (under great duress) to this temptation. It led to the abandonment, for a class of problems of great importance, of the use of the only 'engine for the discovery of truth' that we have in economics.

Thanks to latter-day Lucasian new-classical-new-Keynesian-rational-expectations-representative-agent-microfoundations economists, we are supposed not to – like our primitive ancestors – use that archaic term 'macroeconomics' any more (with the possible exception of warning future economists not to give in to 'discomfort'). Being intellectually heavily indebted to the man who invented macroeconomics – Keynes – yours truly firmly declines to concur.

Microfoundations – and a fortiori rational expectations and representative agents – serve a particular theoretical purpose. And as the history of macroeconomics during the last thirty years has shown, this Lakatosian microfoundations programme for macroeconomics is only methodologically consistent within the framework of a (deterministic or stochastic) general equilibrium analysis. In no other context has it been possible to incorporate this kind of microfoundations, with its "forward-looking optimizing individuals", into macroeconomic models.

This is of course not by accident. General equilibrium theory is basically nothing else than an endeavour to consistently generalize the microeconomics of individuals and firms on to the macroeconomic level of aggregates.  

But it obviously doesn’t work. The analogy between microeconomic behaviour and macroeconomic behaviour is misplaced. Empirically, science-theoretically and methodologically, neoclassical microfoundations for macroeconomics are – as I have argued in a couple of blog-posts the last weeks – defective.  Tenable foundations for macroeconomics really have to be sought for elsewhere.

In his latest post on the subject, Simon Wren-Lewis rhetorically asks:

Microfoundations – is there an alternative? 

Of course there is an alternative to neoclassical general equilibrium microfoundations! Behavioural economics and Frydman & Goldberg's "imperfect knowledge" economics are two noteworthy examples that easily come to mind.

And for those of us who have not forgotten the history of our discipline, and have not bought the sweet-water nursery tale of Lucas et consortes that Keynes was not "serious thinking", it is easy to see that there exists a macroeconomic tradition inspired by Keynes – a tradition that has absolutely nothing to do with any "new synthesis" or "new-Keynesianism".

Its ultimate building block is the perception of genuine uncertainty and of the fact that people often "simply do not know". Real actors cannot know everything, and their acts and decisions cannot simply be summed or aggregated without the economist risking succumbing to "the fallacy of composition".

Instead of basing macroeconomics on unreal and unwarranted generalizations of microeconomic behaviour and relations, it is far better to accept the ontological fact that the future is to a large extent uncertain, and to conduct macroeconomics on the basis of that fact of reality.

The real macroeconomic challenge is to accept uncertainty and still try to explain why economic transactions take place – instead of simply conjuring the problem away by assuming uncertainty to be reducible to stochastic risk. That is scientific cheating. And it has been going on for too long now.

The Keynes-inspired building blocks are there. But there is admittedly a long way to go before the whole construction is in place. The sooner we are intellectually honest and ready to admit that the microfoundationalist programme has reached the end of the road, the sooner we can redirect our aspirations and knowledge toward more fruitful endeavours.

The euro’s stranglehold on Spain’s economy

13 March, 2012 at 15:21 | Posted in Economics, Politics & Society | Comments Off on The euro’s stranglehold on Spain’s economy

Probably just a laddish joke, but perhaps Jean-Claude Juncker’s – leader of the Eurogroup of finance ministers – stranglehold on Spain’s economy minister Luis de Guindos says something about the limits of euro cooperation today …

Teacher salaries – an update

12 March, 2012 at 19:32 | Posted in Economics, Education & School | 5 Comments

SCB has been working hard to digitise its printed publications. As a researcher one can only bow in gratitude. For a few weeks now, for example, Statistisk årsbok för Sverige (1914-), Sveriges officiella statistik i sammandrag (1870-1913), Statistiska Meddelanden (SM), Bidrag till Sveriges officiella statistik (BiSOS), and not least the series Historisk Statistik för Sverige have been digitally available on the web.

Superb!

Of course I immediately took the opportunity to revise and update some of my wage series. But looking at them, it probably doesn’t feel quite as superb if you are a teacher …
 

          Source: SCB, LR and own calculations (CPI 1949 = 100)

Wren-Lewis – an update

12 March, 2012 at 11:30 | Posted in Economics, Theory of Science & Methodology | Comments Off on Wren-Lewis – an update

As mentioned yesterday, Simon Wren-Lewis – economics professor at Oxford University – has a post up, responding to my critique of his and Paul Krugman’s stance on the alleged need for microfoundations in macroeconomics.

As you all know, yours truly usually sides with both Krugman and Wren-Lewis in the ongoing macroeconomic policy debate. But on the microfoundations issue – which is more of a methodological question – I am still not convinced that it suffices to say, as Wren-Lewis does, that “we need to model expectations by some means,” and that rational expectations should do, just because it allows the macroeconomist to “think about expectations errors in a structural way.”

Wren-Lewis also mentions that Michael Woodford, in his recent defence of microfoundations methodology (here), sees rational expectations as one of its main weaknesses, but also hopes that learning models are the way forward. I really think there is no justification for that belief other than pure hope – and that is not enough in science.

As I have previously argued (here and here), rational expectations presupposes – basically for reasons of consistency – that agents have complete knowledge of all of the relevant probability distribution functions. And when one tries to incorporate learning in these models – trying to take the heat off some of the criticism launched against them to date – it is always a very restricted kind of learning that is considered: a learning in which truly unanticipated, surprising, new things never take place, only rather mechanical updatings of existing probability functions that increase the precision of already existing information sets.
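To make the point concrete, here is a minimal sketch (my own illustration, not from the original post) of the kind of ‘mechanical’ learning such models allow: conjugate Bayesian updating of a normal mean with a known observation variance. Whatever the data turn out to be, the posterior variance can only shrink – learning here is a pre-programmed gain in precision inside an already fully specified model.

```python
# Minimal sketch (illustrative, not the author's code): conjugate normal-normal
# Bayesian updating of an unknown mean with known observation variance.
def update(prior_mean, prior_var, obs, obs_var=1.0):
    """One conjugate update; returns the new (mean, variance) of the posterior."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

mean, var = 0.0, 10.0              # diffuse prior over the unknown mean
for y in [2.3, -0.7, 1.9, 0.4]:    # any observations whatsoever
    mean, var = update(mean, var, y)
    print(f"posterior mean {mean:.2f}, posterior variance {var:.2f}")
# The posterior variance falls monotonically (about 0.91, 0.48, 0.32, 0.24):
# precision only ever increases, and no event outside the model can ever show up.
```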

Nothing really new happens in these ergodic models, where the statistical representation of learning and information is nothing more than a caricature of what takes place in the real-world target system. This follows from taking for granted that people’s decisions can be portrayed as based on an existing probability distribution, which by definition implies knowledge of every possible event that could conceivably take place (otherwise, in a strict mathematical-statistical sense, it is not really a probability distribution at all).

But in the real world it is – as behavioural and experimental economics have shown again and again – common to mistake a conditional distribution for a probability distribution. Such mistakes are impossible to make in the kind of analysis – built on rational expectations – that most of modern neoclassical macroeconomics performs, where agents with rational expectations are, on average, always correct. But truly new information will not only reduce the estimation error; it may change the entire estimation and hence possibly the decisions made. To be truly new, information has to be unexpected. If it were not, it would simply be inferred from the already existing information set.

In rational expectations models new information is typically presented as something that only reduces the variance of the parameter estimate. But if new information means truly new information, it may actually increase our uncertainty and variance (the information set expands, say from [A, B] to [A, B, C]).
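A minimal numerical sketch (my own numbers, purely illustrative) of how an expanding information set can raise rather than lower variance:

```python
# Illustrative sketch (not from the post): adding a genuinely new possible
# outcome C to an information set [A, B] can increase the variance of the
# payoff, instead of reducing it as in the usual 'more information => smaller
# variance' story.
def variance(outcomes, probs):
    mean = sum(p * x for x, p in zip(outcomes, probs))
    return sum(p * (x - mean) ** 2 for x, p in zip(outcomes, probs))

# Before: only A and B are thought possible, with payoffs 0 and 1.
print(variance([0, 1], [0.5, 0.5]))            # 0.25

# After: a previously unimagined outcome C with payoff 5 enters the picture.
print(variance([0, 1, 5], [0.45, 0.45, 0.1]))  # about 2.05 -- uncertainty went UP
```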

Truly new information gives birth to new probabilities, revised plans and decisions – something the macroeconomic models built on rational expectations cannot account for, with their finite sampling representation of incomplete information.

In the world of rational expectations, learning is like getting better and better at reciting the complete works of Shakespeare by heart – or at hitting the bull’s eye when playing darts. It presupposes that we have a complete list of the possible states of the world and that, by definition, mistakes are non-systematic (which, strictly speaking, follows from the assumption that “subjective” probability distributions equal the “objective” probability distribution). This is a rather uninteresting and trivial kind of learning. It is closed-world learning, synonymous with improving one’s adaptation to a world that is fundamentally unchanging. But in real, open-world situations, learning is more often about adapting to and trying to cope with genuinely new phenomena.

Rational expectations presumes consistent behaviour, where expectations do not display any persistent errors. In the world of rational expectations we are always, on average, hitting the bull’s eye. In the more realistic, open systems view, there is always the possibility (danger) of making mistakes that may turn out to be systematic. It is because of this, presumably, that we put so much emphasis on learning in our modern knowledge societies.

So – to round off – until modern neoclassical macroeconomics comes to adequately deal with this fundamental flaw in the way it portrays human decisions, choices and genuine uncertainty (of which true novelty, of course, is one instantiation), I will remain dubious of the value of the macromodels that “new Keynesians” and “new classicals” construct on such unwarranted microfoundations.

I agree with Wren-Lewis that “we need to model expectations by some means.” But there have to be better ways than the ill-founded modelling hypothesis of so-called rational expectations. If not, macroeconomics really is in dire straits!

Set fire to the rain

11 March, 2012 at 21:59 | Posted in Varia | Comments Off on Set fire to the rain


This one is for you, J.M.

Wren-Lewis responding to my critique of microfoundations

11 March, 2012 at 19:20 | Posted in Economics | Comments Off on Wren-Lewis responding to my critique of microfoundations

I just noticed that Simon Wren-Lewis – economics professor at Oxford University – has a post up today, responding to my critique of his and Paul Krugman’s stance on the alleged need for microfoundations in macroeconomics. I am still not convinced that it suffices to say, as Wren-Lewis does, that “we need to model expectations by some means,” and that rational expectations should do, just because it allows the macroeconomist to “think about expectations errors in a structural way.” A full argumentation for why I consider this inadequate will, however, have to wait until tomorrow, since I am “on the wing” today.

Jane Eyre

10 March, 2012 at 23:56 | Posted in Varia | 3 Comments

I don’t know how you spent the evening. For me, at any rate, the choice between Melodifestivalen and the film adaptation of Brontë’s classic Jane Eyre, with an enchanting Mia Wasikowska in the lead role, was an easy one.

The Swedish One Percent – Krugman is wrong!

10 March, 2012 at 18:41 | Posted in Economics, Politics & Society | Comments Off on The Swedish One Percent – Krugman is wrong!

Paul Krugman today has a post on Sweden and the rising inequality:

[Y]ou have no business talking about international income distribution if you don’t know about the invaluable World Top Incomes Database. What does this database tell us about Sweden versus America?


Hey, it looks just the same — or, actually, not.

Yes, the top one percent has risen a bit in Sweden. But how anyone could look at this and see the story as similar boggles the mind.

It is not often that yours truly disagrees with Krugman, but here I think he is wrong. It is indeed possible to see the story as similar.

Why? Look at the graphs below (all posted earlier here on the blog): 

 

The average annual percentage growth rate over 1981-2007 was 2.1% in Sweden (in the UK and the US: 2.9%). To me that is an indication that Sweden is also experiencing growing inequality to a notable extent.
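To see what those average annual rates imply cumulatively, here is a back-of-the-envelope sketch (my own arithmetic, not from the post), compounding 2.1% and 2.9% over the 26 years from 1981 to 2007:

```python
# Back-of-the-envelope compounding of the quoted average annual growth rates
# (illustrative arithmetic only, not part of the original post).
years = 2007 - 1981                          # 26 years
for label, rate in [("Sweden", 0.021), ("UK/US", 0.029)]:
    cumulative = (1 + rate) ** years - 1
    print(f"{label}: {cumulative:.0%} cumulative growth over {years} years")
# Roughly 72% for Sweden versus roughly 110% for the UK/US.
```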

Also look at this plot (based on data from The Top Incomes Database):

During the last sixty years the top income shares in Sweden, the United Kingdom and the United States have developed like this:

            Source: The Top Incomes Database

And look at the figure below, which shows how the distribution of mean income and wealth (expressed in 2009 prices) for the top 0.1% and the bottom 90% has changed in Sweden over the last 30 years:

 
                  Source: The World Top Incomes Database

I would say the development in Sweden is also deeply problematic and going in the wrong direction. The main difference compared to the UK and the US is really that the increasing inequality in Sweden (now going on continuously for 30 years) started from a lower level.
The rising inequality probably has to do with income and wealth increasingly being concentrated in the hands of a very small and privileged elite – in Sweden as well as in the UK and the US.

Update (13/3): Jesper Roine, one of the researchers behind The World Top Incomes Database, has an interesting comment (in Swedish) on the debate here.

This is how we increase the attractiveness of the teaching profession

9 March, 2012 at 18:27 | Posted in Education & School | 5 Comments

Year after year we see how the desire to become a teacher declines. At the beginning of the 1980s there were almost eight applicants per place in the primary school teacher education programme. Today there is one applicant per place. This is a societal catastrophe that we ought to be talking about. In a world where everything hinges on knowledge, it is in the long run crucial for the Swedish economy to make the teaching profession attractive again.

In Sweden the standard of living, measured as per capita income, has increased by a factor of more than 50 since the mid-nineteenth century. Overall, people in the Western world today are more than twenty times richer than they were a century and a half ago, and their life expectancy is almost twice that of their forefathers. What has created this increase in welfare and living standards?
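As a rough sanity check on that factor of 50 (my own arithmetic, assuming a span of roughly 160 years from the mid-nineteenth century to today), the implied average annual growth rate of per capita income can be sketched like this:

```python
# Rough implied average annual growth rate behind 'a factor of more than 50'
# in per capita income (illustrative; the ~160-year span is my own assumption).
factor, years = 50, 160
annual_rate = factor ** (1 / years) - 1
print(f"implied average annual growth: {annual_rate:.1%}")   # about 2.5% per year
```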

Knowledge is a kind of instruction or recipe that tells us how we can use our resources to produce goods. With better knowledge, growth can increase even if material resources are limited. Unlike people (with their particular skills and education) and things (shares, machines, natural resources), knowledge is governed by increasing returns. An object (a portion of food) can only be consumed by one person at a time, while knowledge (the recipe) can be used by any number of people at any time.

So the production of knowledge is absolutely crucial for the creation of the wealth of nations. And if ideas and knowledge play such a decisive role in long-run growth and welfare, considerably more of the public debate ought to be about education strategies, research investments and teacher salaries.

One of the most important explanations for the catastrophic development in applications to teacher education programmes in Sweden is that teacher salaries have been eroded over a long period. Swedish teachers today rank far down the OECD pay list. In the recently published report Läraryrkets attraktionskraft på fallrepet, Lärarnas Riksförbund showed that purchasing-power-adjusted Swedish teacher salaries are today on a par with teacher salaries in Greece and Portugal.

Looking at how teachers’ real wages have developed over the last four decades, it is really a wonder that anyone in this country can even consider becoming a teacher:

                  Source: LR, SCB and own calculations.

Last year Saco presented lifetime earnings calculations of how different educational choices pay off over a whole working life. The results were compared with an average individual who starts working straight after upper secondary school. The study tried to estimate what different academic choices mean in economic terms, taking into account forgone income and indebtedness during the years of study, income taxes and so on.

For teachers this was – unsurprisingly, but still – depressing reading. It turned out that training to become a teacher is a sheer loss-making proposition. All teacher education programmes yield a negative return – their graduates end up with lower lifetime earnings than those who start working straight after upper secondary school instead of getting a university degree.

Teachers in the earlier years of compulsory school have the worst earnings development of all. By the time they retire they have earned eight percent less than the classmates who started working straight after upper secondary school.

Higher teacher salaries are not a sufficient condition for once again having a world-class Swedish school system. But they are a necessary condition! Extensive educational research has convincingly shown that municipal governance of the schools is one of the most important reasons behind the decline of teacher salaries and of the Swedish school system in recent decades.

The political parties must drop their ideological blinkers and realise that the odd sacred cow has to be slaughtered if we are to set the Swedish school system right. Folkpartiet realised almost ten years ago that when the facts about schools kick back, you simply have to change course – even if it should happen to conflict with ideology. When will the other Alliance parties and the Social Democrats dare to take that step?

If the attractiveness of the teaching profession is to rise, higher salaries are a must! Is that really so hard to understand? There are no free lunches. If we want a world-class Swedish school system, it has to be allowed to cost. Raise teacher salaries. For Sweden’s economy. And for Sweden’s future.

Instead of Ranelid et consortes

8 March, 2012 at 18:00 | Posted in Varia | Comments Off on Instead of Ranelid et consortes


Damn, how I miss you, Fred!

Microfoundations – redux

8 March, 2012 at 12:21 | Posted in Economics, Theory of Science & Methodology | 2 Comments

Simon Wren-Lewis has a new post up on his blog today, where he tries to explain why he can’t agree with Paul Krugman’s statement that
So as I see it, the whole microfoundations crusade is based on one predictive success some 35 years ago; there have been no significant payoffs since.
Why does Wren-Lewis disagree? He writes: 
I think the two most important microfoundation led innovations in macro have been intertemporal consumption and rational expectations. I have already talked about the former in an earlier post … [s]o let me focus on rational expectations …  [T]he adoption of rational expectations was not the result of some previous empirical failure. Instead it represented, as Lucas said, a consistency axiom …
I think macroeconomics today is much better than it was 40 years ago as a result of the microfoundations approach. I also argued in my previous post that a microfoundations purist position – that this is the only valid way to do macro – is a mistake. The interesting questions are in between. Can the microfoundations approach embrace all kinds of heterogeneity, or will such models lose their attractiveness in their complexity? Does sticking with simple, representative agent macro impart some kind of bias? Does a microfoundations approach discourage investigation of the more ‘difficult’ but more important issues? Might both these questions suggest a link between too simple a micro based view and a failure to understand what was going on before the financial crash? Are alternatives to microfoundations modelling methodologically coherent? Is empirical evidence ever going to be strong and clear enough to trump internal consistency? These are difficult and often quite subtle questions that any simplistic for and against microfoundations debate will just obscure. 
On this argumentation I would like to add the following comments:
 
(1) The fact that Lucas introduced rational expectations as a consistency axiom is not really an argument for why we should accept it as an assumption in a theory or model purporting to explain real macroeconomic processes (see e.g. my article Robert Lucas, rational expectations, and the understanding of business cycles and my previous post on microfoundations here).
(2) “Now virtually any empirical claim in macro is contestable,” Wren-Lewis writes. Yes, but so too is virtually any claim in macro (see e.g. my article When the model is the message – modern neoclassical economics).
(3) To the two questions “Can the microfoundations approach embrace all kinds of heterogeneity, or will such models lose their attractiveness in their complexity?” and “Does sticking with simple, representative agent macro impart some kind of bias?” I would unequivocally answer yes! I have given my reasons, e.g., in my article David Levine is totally wrong on the rational expectations hypothesis, so I will not repeat the argumentation here.
(4) “Are alternatives to microfoundations modelling methodologically coherent?” Well, I don’t know. But one thing I do know is that the kind of microfoundationalist macroeconomics that new classical economists in the vein of Lucas and Sargent and the so-called new Keynesian economists in the vein of Mankiw and Yellen are pursuing is not methodologically coherent, as I have argued e.g. in my RWER article What is (wrong with) economic theory? And that ought to be rather embarrassing for those ilks of macroeconomists to whom axiomatics and deductivity are the hallmark of science tout court.
 
So in the Wren-Lewis – Krugman discussion on microfoundations I think Krugman is closer to the truth with his remark:
[W]hat we call “microfoundations” are not like physical laws. Heck, they’re not even true.
 

Wren-Lewis, Noahpinion and Krugman on microfoundations of macroeconomics

7 March, 2012 at 14:43 | Posted in Economics, Theory of Science & Methodology | 1 Comment



Simon Wren-Lewis, Paul Krugman and Noah Smith (Noahpinion) have interesting posts up discussing whether macroeconomics needs microfoundations.

Smith’s view is that microfoundations “probably lead to better models,” in the sense of models “more useful for predicting the future.”

Wren-Lewis is more sceptical:

[S]uppose there is in fact more than one valid microfoundation for a particular aggregate model. In other words, there is not just one, but perhaps a variety of particular worlds which would lead to this set of aggregate macro relationships. (We could use an analogy, and say that these microfoundations were observationally equivalent in aggregate terms.) Furthermore, suppose that more than one of these particular worlds was a reasonable representation of reality. (Among this set of worlds, we cannot claim that one particular model represents the real world and the others do not.) It would seem to me that in this case the aggregate model derived from these different worlds has some utility beyond just one of these microfounded models. It is robust to alternative microfoundations. 

In these circumstances, it would seem sensible to go straight to the aggregate model, and ignore microfoundations.

Paul Krugman is also doubtful of the value of microfoundations:

[W]hat we call “microfoundations” are not like physical laws. Heck, they’re not even true. Maximizing consumers are just a metaphor, possibly useful in making sense of behavior, but possibly not. The metaphors we use for microfoundations have no claim to be regarded as representing a higher order of truth than the ad hoc aggregate metaphors we use in IS-LM or whatever; in fact, we have much more supportive evidence for Keynesian macro than we do for standard micro.

Yours truly basically sides with Wren-Lewis and Krugman on this issue, but I will try to explain why one might be even more critical and doubtful than they are regarding microfoundations of macroeconomics.

Microfoundations today means more than anything else that you try to build macroeconomic models assuming “rational expectations” and “representative actors”. Both are highly questionable assumptions.

Rational expectations

The concept of rational expectations was first developed by John Muth (1961) and later applied to macroeconomics by Robert Lucas (1972). The macroeconomic models building on rational expectations-microfoundations that are used today among both “new classical” and “new Keynesian” macroeconomists basically assume that people on average hold expectations that will be fulfilled. This makes the economist’s analysis enormously simplistic, since it means that the model used by the economist is the same as the one people use to make decisions and forecasts of the future.

Macroeconomic models building on rational expectations-microfoundations assume that people, on average, have the same expectations. Someone like Keynes, on the other hand, would argue that people often have different expectations and information, which constitutes the basic rationale behind the macroeconomic need for coordination – something that is swept under the rug by the extreme simple-mindedness of assuming rational expectations in representative actors models, which is so in vogue in “new classical” and “new Keynesian” macroeconomics. But if all actors are alike, why do they transact? Who do they transact with? The very reason for markets and exchange seems to slip away with the sister assumptions of representative actors and rational expectations.

Macroeconomic models building on rational expectations-microfoundations impute beliefs to the agents that are not based on any real informational considerations, but are simply stipulated to make the models mathematically-statistically tractable. Of course you can make assumptions based on tractability, but then you also have to take into account the necessary trade-off in terms of the ability to make relevant and valid statements about the intended target system. Mathematical tractability cannot be the ultimate arbiter in science when it comes to modelling real-world target systems. One could perhaps accept macroeconomic models building on rational expectations-microfoundations if they had produced lots of verified predictions and good explanations. But they have done nothing of the kind. Therefore the burden of proof is on those macroeconomists who still want to use models built on these particular unreal assumptions.

In macroeconomic models building on rational expectations-microfoundations – where agents are assumed to have complete knowledge of all of the relevant probability distribution functions – nothing really new happens, since they take for granted that people’s decisions can be portrayed as based on an existing probability distribution, which by definition implies knowledge of every possible event that could conceivably take place (otherwise, in a strict mathematical-statistical sense, it is not really a probability distribution at all).

But in the real world, it is not possible simply to assume that probability distributions are the right way to characterize, understand or explain acts and decisions made under uncertainty. When we simply do not know, when we have not got a clue, when genuine uncertainty prevails, macroeconomic models building on rational expectations-microfoundations simply will not do. In those circumstances the assumption is not useful. The reason is that under those circumstances the future is not like the past, and hence we cannot use the same probability distribution – if it exists at all – to describe both the past and the future.

The future is not reducible to a known set of prospects. It is not like sitting at the roulette table and calculating what the future outcomes of spinning the wheel will be. We have to surpass macroeconomic models building on rational expectations-microfoundations and instead try to build economics on a more realistic foundation. A foundation that encompasses both risk and genuine uncertainty.

Macroeconomic models building on rational expectations-microfoundations emanate from the belief that, to be scientific, economics has to be able to model individuals and markets in a stochastic-deterministic way. It’s like treating individuals and markets as the celestial bodies studied by astronomers with the help of gravitational laws. Unfortunately, individuals, markets and entire economies are not planets moving in predetermined orbits in the sky.

To deliver macroeconomic models building on rational expectations-microfoundations, economists have to constrain expectations on the individual and the aggregate level to be the same. If revisions of expectations take place, they typically have to take place in a known and pre-specified way. This squares badly with what we know to be true in the real world, where fully specified trajectories of future expectations revisions are non-existent.

Further, most macroeconomic models building on rational expectations-microfoundations are time-invariant and so leave no room for any changes in expectations and their revisions. The only imperfection of knowledge they admit of is included in the error terms – error terms that are assumed to be additive and to have a given and known frequency distribution, so that the models can still fully pre-specify the future even when these stochastic variables are incorporated.
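A minimal sketch (my own illustration, with hypothetical parameter values) of what this amounts to in the simplest linear case: if x follows x(t+1) = ρ·x(t) + ε(t+1) with ε drawn from a known distribution, the model-consistent forecast is ρ·x(t), so the only imperfection of knowledge is the additive shock itself, and forecast errors are unsystematic by construction.

```python
# Illustrative sketch (not the author's code): in a linear rational-expectations
# setup x_{t+1} = rho * x_t + eps_{t+1}, with eps drawn from a known distribution,
# the 'rational' forecast E_t[x_{t+1}] = rho * x_t makes the forecast error equal
# to the shock itself -- systematic mistakes are ruled out by assumption.
import random

random.seed(1)
rho, x = 0.9, 0.0
errors = []
for _ in range(100_000):
    forecast = rho * x                      # the model-consistent expectation
    x = rho * x + random.gauss(0.0, 1.0)    # the 'known' additive shock
    errors.append(x - forecast)             # forecast error = the shock itself

print(sum(errors) / len(errors))            # close to zero: no persistent errors
```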

In the real world there are many different expectations and these cannot be aggregated in macroeconomic models building on rational expectations-microfoundations without giving rise to inconsistency. This is one of the main reasons for these models being modeled as representative actors models. But this is far from being a harmless approximation to reality. Even the smallest differences of expectations between agents would make these models inconsistent, so when they still show up they have to be considered “irrational”.

It is not possible to adequately represent individuals and markets as having one single overarching probability distribution. Accepting that does not imply that we have to end all theoretical endeavours and assume that all agents always act totally irrationally and are analyzable only within behavioural economics. Far from it. It means we acknowledge diversity and imperfection, and that macroeconomics has to be able to incorporate these empirical facts in its models.

Most models in science are representations of something else. Models “stand for” or “depict” specific parts of a “target system” (usually the real world). A model that has neither surface nor deep resemblance to important characteristics of real economies ought to be treated with prima facie suspicion. How could we possibly learn about the real world if there are no parts or aspects of the model that have relevant and important counterparts in the real-world target system? The burden of proof lies on the macroeconomists who think they have contributed anything of scientific relevance without even hinting at any bridge enabling us to traverse from model to reality. All theories and models have to use sign vehicles to convey some kind of content that may be used for saying something about the target system. But purpose-built assumptions made solely to secure a way of reaching deductively validated results in mathematical models are of little value if they cannot be validated outside of the model.

All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is (no longer) the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

Theories are difficult to directly confront with reality. Economists therefore build models of their theories. Those models are representations that are directly examined and manipulated to indirectly say something about the target systems.

But being able to model a world that somehow could be considered real or similar to the real world is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still serve our pursuit of truth. But then they cannot be unrealistic or false in just any way. The falsehood or unrealism has to be qualified.

The microfounded macromodel should enable us to pose counterfactual questions about what would happen if some variable were to change in a specific way (hence the assumption of structural invariance, which purportedly enables the theoretical economist to do just that). But does it? Applying a “Lucas critique” to most microfounded macromodels, it is obvious that they fail. Changing “policy rules” cannot just be presumed not to influence investment and consumption behaviour and, a fortiori, technology – thereby contradicting the invariance assumption. Technology and tastes cannot live up to the status of an economy’s deep and structurally stable Holy Grail. They too are part and parcel of an ever-changing and open economy.

Representative agents

Without export certificates, models and theories should be considered unsold. Unfortunately this understanding has not informed modern neoclassical economics, as can be seen in the profuse use of so-called representative-agent models.

A common feature of modern neoclassical macroeconomics is to use simple (dynamic stochastic) general equilibrium models where representative actors are supposed to have complete knowledge, zero transaction costs and complete markets.

In these models, the actors are all identical. Of course, this has far-reaching analytical implications. Situations characterized by asymmetrical information – situations most of us consider to be innumerable – cannot arise in such models. If the aim is to build a macro-analysis from micro-foundations in this manner, the relevance of the procedure is highly questionable (Robert Solow has even considered the claims made by protagonists of rational agent models “generally phony”).

One obvious critique is that representative-agent models do not incorporate distributional effects – effects that often play a decisive role in macroeconomic contexts. Investigations into the operations of markets and institutions usually find that there are overwhelming problems of coordination. These are difficult, not to say impossible, to analyze with the kind of Robinson Crusoe models that, e.g., real business cycle theorists employ – models which exclude precisely those differences between groups of actors that are the driving force in many non-neoclassical analyses.

The choices of different individuals have to be shown to be coordinated and consistent. This is obviously difficult if the macroeconomic models don’t allow for heterogeneous individuals (this failure to appreciate the importance of heterogeneity is perhaps especially problematic for the modelling of real business cycles in dynamic stochastic general equilibrium models). Representative-agent models are certainly more manageable; from a realist point of view, however, they are also less relevant and have lower explanatory potential.

Both the “Lucas critique” and Keynes’ critique of econometrics argued that it was inadmissible to project history onto the future. Consequently an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different “variables” was not enough. If they could not get at the causal structure that generated the data, they were not really “identified.” Lucas himself drew the conclusion that the way to deal with the problem of unstable relations was to construct models with clear microfoundations, where forward-looking optimizing individuals and robust, deep, behavioural parameters are seen to be stable even to changes in economic policies.

In microeconomics we know that aggregation really presupposes homothetic and identical preferences, something that almost never exists in real economies. The results given by these assumptions are therefore not robust and do not capture the underlying mechanisms at work in any real economy. And as if this were not enough, there are obvious problems also with the kind of microeconomic equilibrium that one tries to reduce macroeconomics to. Decisions of consumption and production are described as choices made by a single agent. But then, who sets the prices on the market? And how do we justify the assumption of universal consistency between the choices?
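As a hedged formal sketch of the aggregation point (standard textbook material, my addition rather than anything in the original post): exact aggregation of individual demands into a representative consumer requires indirect utility functions of the Gorman polar form with a common income coefficient, of which identical homothetic preferences are a special case.

```latex
% Gorman polar form condition for exact aggregation (textbook sketch, my addition):
v_i(p, m_i) = a_i(p) + b(p)\, m_i , \qquad i = 1, \dots, n,
% with the same b(p) for every consumer i. Only then does aggregate demand depend
% solely on aggregate income,
X(p, m_1, \dots, m_n) = X\Big(p, \textstyle\sum_i m_i\Big),
% so that a 'representative consumer' is well defined. Identical homothetic
% preferences are a special case; outside such knife-edge cases the construction fails.
```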

Models that are critically based on particular and odd assumptions – and are neither robust nor congruent with real-world economies – are of questionable value.

And is it really possible to describe and analyze all the deliberations and choices made by individuals in an economy? Does not the choice of an individual presuppose knowledge and expectations about choices of other individuals? It probably does, and this presumably helps to explain why representative-agent models have become so popular in modern macroeconomic theory. They help to make the analysis more tractable.

One could justifiably argue that one might just as well accept that it is not possible to coherently reduce macro to micro, and accordingly that it is perhaps necessary to forswear microfoundations and the use of rational-agent models altogether. Microeconomic reasoning has to build on macroeconomic presuppositions. Real individuals do not base their choices on operational general equilibrium models, but rather use simpler models. If macroeconomics needs microfoundations, then it is equally necessary for microeconomics to have macrofoundations.

The microeconomist Alan Kirman has maintained that the use of representative-agent models is unwarranted and leads to conclusions that are usually both misleading and false. It’s a fiction basically used by some macroeconomists to justify the use of equilibrium analysis and a kind of pseudo-microfoundations. Microeconomists are well aware that the conditions necessary to make aggregation to representative actors possible, are not met in actual economies. As economic models become increasingly complex, their use also becomes less credible.

Even if economies naturally presuppose individuals, it does not follow that we can infer or explain macroeconomic phenomena solely from knowledge of these individuals. Macroeconomics is to a large extent emergent and cannot be reduced to a simple summation of micro-phenomena. Moreover, as we have already argued, even these microfoundations aren’t immutable. Lucas and the new classical economists’ deep parameters – “tastes” and “technology” – are not really the bedrock of constancy that they believe (pretend) them to be.

For Alfred Marshall economic theory was “an engine for the discovery of concrete truth.” But where Marshall tried to describe the behaviour of a typical business with the concept of the “representative firm,” his modern heirs do not at all try to describe how firms interact with other firms in an economy. The economy is instead described “as if” it consisted of one single giant firm – either by inflating the optimization problem of the individual to the scale of the whole economy, or by assuming that it is possible to aggregate different individuals’ actions by simple summation, since every type of actor is identical. But don’t we just have to face the fact that it is difficult to describe interaction and cooperation when there is essentially only one actor?

Conclusion

Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies and institutions are those based on rational expectations and representative actors. But there is really no support for this conviction at all. On the contrary. If we want to have anything of interest to say on real economies, financial crisis and the decisions and choices real people make, it is high time to place macroeconomic models building on representative actors and rational expectations-microfoundations where they belong – in the dustbin of history.

For if this microfounded macroeconomics has nothing to say about the real world and the economic problems out there, why should we care about it? The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging is de facto made, macroeconomic model building is little more than hand waving that gives us rather little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

So, these are my arguments for why I think that Simon Wren-Lewis, Paul Krugman and all other macroeconomists ought to be even more critical of the microfoundationists than they are. If macroeconomic models – no matter of what ilk – build on microfoundational assumptions of representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, then the warrant for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged to real-world target systems is obviously lacking. Incompatibility between actual behaviour and the behaviour in macroeconomic models building on representative actors and rational expectations-microfoundations is not a symptom of “irrationality.” It rather shows the futility of trying to represent real-world target systems with models flagrantly at odds with reality.
