Why game theory will be nothing but a footnote in the history of social science

30 September, 2017 at 12:35 | Posted in Economics | Comments Off on Why game theory will be nothing but a footnote in the history of social science

Ever since it was introduced back in the 1950s, Nash equilibrium has been the standard solution concept used by game theorists. The justification for its use has mainly been built on dubious and contentious assumptions like ‘common knowledge’ and individuals identified exclusively as instrumentally rational. And as if that were not enough, to ‘save’ the Holy Equilibrium Grail one has actually had to make the further, ridiculously unreal, assumption that those individuals have ‘consistently aligned beliefs’ — effectively treating different individuals as incarnations of the microfoundationalist ‘representative agent.’

In the beginning — in the 50’s and 60’s — hopes were high that game theory would enhance our possibilities of understanding/explaining the behaviour of interacting actors in non-parametric settings. And this is where we ended up! A sad story, indeed, showing the limits of methodological individualism and instrumental rationality.

Why not give up on the Nash concept altogether? Why not give up the vain dream of trying to understand social interaction by reducing it to something that can be analyzed within a grotesquely unreal model of instrumentally interacting identical robot imitations?

We believe that a variety of contributory factors can be identified …

It is possible that the strange philosophical moorings of neoclassical economics and game theory have played a part. They are strange in at least two respects. The first is a kind of amnesia or lobotomy which the discipline seems to have suffered regarding most things philosophical during the postwar period … The second is the utilitarian historical roots of modern economics … Thirdly, the sociology of the discipline may provide further clues … All academics have fought their corner in battles over resources and they always use the special qualities of their discipline as ammunition in one way or another. Thus one might explain in functionalist terms the mystifying attachments of economics and game theory to Nash.

Half a century ago there were widespread hopes that game theory would provide a unified theory of all social science. Today it has become obvious that those hopes did not materialize. This ought to come as no surprise. Reductionist and atomistic models of social interaction — such as the ones neoclassical mainstream economics is founded on — can never deliver good building blocks for a realist and relevant social science.

Rational expectations — the triumph of ideology over science

29 September, 2017 at 18:53 | Posted in Economics | 2 Comments

Research shows not only that individuals sometimes act differently than standard economic theories predict, but that they do so regularly, systematically, and in ways that can be understood and interpreted through alternative hypotheses, competing with those utilised by orthodox economists.

To most market participants — and, indeed, ordinary observers — this does not seem like big news … In fact, this irrationality is no news to the economics profession either. John Maynard Keynes long ago described the stock market as based not on rational individuals struggling to uncover market fundamentals, but as a beauty contest in which the winner is the one who guesses best what the judges will say …

Adam Smith’s invisible hand — the idea that free markets lead to efficiency as if guided by unseen forces — is invisible, at least in part, because it is not there …

For more than 20 years, economists were enthralled by so-called “rational expectations” models which assumed that all participants have the same (if not perfect) information and act perfectly rationally, that markets are perfectly efficient, that unemployment never exists (except when caused by greedy unions or government minimum wages), and where there is never any credit rationing.

That such models prevailed, especially in America’s graduate schools, despite evidence to the contrary, bears testimony to a triumph of ideology over science. Unfortunately, students of these graduate programmes now act as policymakers in many countries, and are trying to implement programmes based on the ideas that have come to be called market fundamentalism … Good science recognises its limitations, but the prophets of rational expectations have usually shown no such modesty.

Joseph Stiglitz

 
Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies and institutions are those based on rational expectations and representative actors. As yours truly has tried to show in On the use and misuse of theories and models in economics, there is really no support for this conviction at all. On the contrary. If we want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, it is high time to place macroeconomic models built on representative actors and rational expectations microfoundations in the dustbin of pseudo-science.

For if this microfounded macroeconomics has nothing to say about the real world and the economic problems out there, why should we care about it? The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging de facto is made, macroeconomic model-building is little more than hand-waving that gives us rather little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

The real macroeconomic challenge is to accept uncertainty and still try to explain why economic transactions take place — instead of simply conjuring the problem away by assuming rational expectations and treating uncertainty as if it were possible to reduce it to stochastic risk. That is scientific cheating. And it has been going on for too long now.

In Dreams

29 September, 2017 at 16:15 | Posted in Varia | 1 Comment


In loving memory of Kristina — beloved wife and mother of David and Tora.

Like Tears in Rain

29 September, 2017 at 15:34 | Posted in Varia | Comments Off on Like Tears in Rain

 

Hicks on neoclassical ‘uncertainty laundering’

28 September, 2017 at 09:56 | Posted in Economics | Comments Off on Hicks on neoclassical ‘uncertainty laundering’

To understand real world ‘non-routine’ decisions and unforeseeable changes in behaviour, ergodic probability distributions are of no avail. In a world full of genuine uncertainty — where real historical time rules the roost — the probabilities that ruled the past are not necessarily those that will rule the future.

When we cannot accept that the observations, along the time-series available to us, are independent … we have, in strict logic, no more than one observation, all of the separate items having to be taken together. For the analysis of that the probability calculus is useless; it does not apply … I am bold enough to conclude, from these considerations that the usefulness of ‘statistical’ or ‘stochastic’ methods in economics is a good deal less than is now conventionally supposed … We should always ask ourselves, before we apply them, whether they are appropriate to the problem in hand. Very often they are not … The probability calculus is no excuse for forgetfulness.

John Hicks, Causality in Economics, 1979:121

Time to abandon statistical significance

27 September, 2017 at 10:55 | Posted in Statistics & Econometrics | 6 Comments

We recommend dropping the NHST [null hypothesis significance testing] paradigm — and the p-value thresholds associated with it — as the default statistical paradigm for research, publication, and discovery in the biomedical and social sciences. Specifically, rather than allowing statistical significance as determined by p < 0.05 (or some other statistical threshold) to serve as a lexicographic decision rule in scientific publication and statistical decision making more broadly as per the status quo, we propose that the p-value be demoted from its threshold screening role and instead, treated continuously, be considered along with the neglected factors [such factors as prior and related evidence, plausibility of mechanism, study design and data quality, real world costs and benefits, novelty of finding, and other factors that vary by research domain] as just one among many pieces of evidence.

We make this recommendation for three broad reasons. First, in the biomedical and social sciences, the sharp point null hypothesis of zero effect and zero systematic error used in the overwhelming majority of applications is generally not of interest because it is generally implausible. Second, the standard use of NHST — to take the rejection of this straw man sharp point null hypothesis as positive or even definitive evidence in favor of some preferred alternative hypothesis — is a logical fallacy that routinely results in erroneous scientific reasoning even by experienced scientists and statisticians. Third, p-value and other statistical thresholds encourage researchers to study and report single comparisons rather than focusing on the totality of their data and results.

Andrew Gelman et al.

As shown over and over again when significance tests are applied, people have a tendency to read ‘not disconfirmed’ as ‘probably confirmed.’ Standard scientific methodology tells us that when there is only say a 10 % probability that pure sampling error could account for the observed difference between the data and the null hypothesis, it would be more ‘reasonable’ to conclude that we have a case of disconfirmation. Especially if we perform many independent tests of our hypothesis and they all give about the same 10 % result as our reported one, I guess most researchers would count the hypothesis as even more disconfirmed.

We should never forget that the underlying parameters we use when performing significance tests are model constructions. Our p-values mean nothing if the model is wrong. And most importantly — statistical significance tests DO NOT validate models!

In journal articles a typical regression equation will have an intercept and several explanatory variables. The regression output will usually include an F-test, with p – 1 degrees of freedom in the numerator and n – p in the denominator. The null hypothesis will not be stated. The missing null hypothesis is that all the coefficients vanish, except the intercept.

If F is significant, that is often thought to validate the model. Mistake. The F-test takes the model as given. Significance only means this: if the model is right and the coefficients are 0, it is very unlikely to get such a big F-statistic. Logically, there are three possibilities on the table:
i) An unlikely event occurred.
ii) Or the model is right and some of the coefficients differ from 0.
iii) Or the model is wrong.
So?
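
To see the point in numbers, here is a minimal simulation sketch of my own (a toy example, not taken from the quoted text): the true data-generating process is nonlinear, yet the routine F-test on a linear regression with an intercept and two explanatory variables comes out wildly ‘significant’.

```python
# A toy simulation illustrating that a "significant" F-statistic does not
# certify that the regression model is right.  The true data-generating
# process is nonlinear, yet a linear regression with an intercept and two
# regressors still produces a huge, highly "significant" F-statistic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = np.exp(x1) + 0.5 * x2**3 + rng.normal(size=n)   # wrong for a linear model

X = np.column_stack([np.ones(n), x1, x2])            # intercept + 2 regressors
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta
tss = np.sum((y - y.mean()) ** 2)
rss = np.sum((y - fitted) ** 2)

p = X.shape[1]                                        # parameters incl. intercept
F = ((tss - rss) / (p - 1)) / (rss / (n - p))
p_value = stats.f.sf(F, p - 1, n - p)

print(f"F = {F:.1f}, p-value = {p_value:.2e}")
# The F-test takes the linear model as given; rejecting the null that all
# slope coefficients vanish says nothing about whether that model is right.
```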

Neoliberal ‘ethics’

26 September, 2017 at 13:20 | Posted in Politics & Society | 3 Comments

As we all know, neoliberalism is nothing but a self-serving con endorsing pernicious moral cynicism. But it’s still sickening to read its gobsmacking trash, maintaining that unregulated capitalism is a ‘superlatively moral system’:

The rich man may feast on caviar and champagne, while the poor woman starves at his gate. And she may not even take the crumbs from his table, if that would deprive him of his pleasure in feeding them to his birds.
 

David Gauthier Morals by Agreement

Now, compare that unashamed neoliberal apologetics with what two truly great economists and liberals — John Maynard Keynes and Robert Solow — have to say:

The outstanding faults of the economic society in which we live are its failure to provide for full employment and its arbitrary and inequitable distribution of wealth and incomes … I believe that there is social and psychological justification for significant inequalities of income and wealth, but not for such large disparities as exist to-day.

John Maynard Keynes General Theory (1936)

Who could be against allowing people their ‘just deserts?’ But there is that matter of what is ‘just.’ Most serious ethical thinkers distinguish between deservingness and happenstance. Deservingness has to be rigorously earned. You do not ‘deserve’ that part of your income that comes from your parents’ wealth or connections or, for that matter, their DNA. You may be born just plain gorgeous or smart or tall, and those characteristics add to the market value of your marginal product, but not to your deserts. It may be impractical to separate effort from happenstance numerically, but that is no reason to confound them, especially when you are thinking about taxation and redistribution. That is why we want to temper the wind to the shorn lamb, and let it blow on the sable coat.

Robert Solow Journal of Economic Perspectives (2014)

Public debt — an economic necessity

26 September, 2017 at 09:06 | Posted in Economics | 6 Comments

We are not going to get out of the economic doldrums as long as we continue to be obsessed with the unreasoned ideological goal of reducing the so-called deficit. The “deficit” is not an economic sin but an economic necessity. […]

The administration is trying to bring the Titanic into harbor with a canoe paddle, while Congress is arguing over whether to use an oar or a paddle, and the Perot’s and budget balancers seem eager to lash the helm hard-a-starboard towards the iceberg. Some of the argument seems to be over which foot is the better one to shoot ourselves in. We have the resources in terms of idle manpower and idle plants to do so much, while the preachers of austerity, most of whom are in little danger of themselves suffering any serious consequences, keep telling us to tighten our belts and refrain from using the resources that lay idle all around us.

Alexander Hamilton once wrote “A national debt, if it be not excessive, would be for us a national treasure.” William Jennings Bryan used to declaim, “You shall not crucify mankind upon a cross of gold.” Today’s cross is not made of gold, but is concocted of a web of obfuscatory financial rectitude from which human values have been expunged.

William Vickrey

Right-wing extremist party in German parliament for the first time since WWII

25 September, 2017 at 13:57 | Posted in Politics & Society | Comments Off on Right-wing extremist party in German parliament for the first time since WWII

 

Trump’s ‘alternative facts’ exported to German politics …

Seven sins of economics

23 September, 2017 at 11:44 | Posted in Economics | 4 Comments

There has always been some level of scepticism about the ability of economists to offer meaningful predictions and prognosis about economic and social phenomenon. That scepticism has heightened in the wake of the global financial crisis, leading to what is arguably the biggest credibility crisis the discipline has faced in the modern era.

Some of the criticisms against economists are misdirected. But the major thrust of the criticisms does have bite.

There are seven key failings, or the ‘seven sins’, as I am going to call them, that have led economists to their current predicament. These include sins of commission as well as sins of omission.

Sin 1: Alice in Wonderland assumptions

The problem with economists is not that they make assumptions. After all, any theory or model will have to rely on simplifying assumptions … But when critical assumptions are made just to circumvent well-identified complexities in the quest to build elegant theories, such theories will simply end up being elegant fantasies.

Sin 2: Abuse of modelling

What compounds the sin of wild assumptions is the sin of careless modelling, and then selling that model as if it were a true depiction of an economy or society …

Sin 3: Intellectual capture

Several post-crisis assessments of the economy and of economics have pointed to intellectual capture as a key reason the profession, as a whole, failed to sound alarm bells about problems in the global economy, and failed to highlight flaws in the modern economic architecture …

Sin 4: The science obsession

The excessive obsession in the discipline to identify itself as science has been costly. This has led to a dangerous quest for standardization in the profession, leading many economists to mistake a model of the economy for ‘the model’ of the economy …

The science obsession has diminished the diversity of the profession, and arguably allowed complacency to take root in the run-up to the global financial crisis …

Sin 5: Perpetuating the myth of ‘the textbook’ and Econ 101

The quest for standardization has also led to an astonishing level of uniformity in the manner in which economists are trained, and in the manner in which economists train others. Central to this exercise are textbooks that help teach the lessons of ‘Econ 101’—lessons disconnected from reality as they are from the frontiers of economic research …

Sin 6: Ignoring society

What makes Econ 101 and a lot of mainstream economics particularly limiting is its neglect of the role of culture and social norms in determining economic outcomes even though classical economists such as Adam Smith and Karl Marx took care to emphasize how social norms and social interactions shape economic outcomes …

Economists typically don’t engage with other social sciences, even though insights from those disciplines have a direct bearing on the subjects of economic enquiry …

Sin 7: Ignoring history

One way in which economists could have compensated for the lack of engagement with other social sciences is by studying economic history. After all, studying economic history carefully can help us understand the social and institutional contexts in which particular economic models worked, or did not work …

But economic history has been relegated to the margins over the past several years, and many graduate students remain unacquainted with the subject still.

Pramit Bhattacharya

Game theory and the shaping of neoliberal capitalism

21 September, 2017 at 17:00 | Posted in Economics | 9 Comments

Neoliberal subjectivity arises from the intricate pedagogy of game theory that comes to the fore in the Prisoner’s Dilemma game and is interchangeable with contemporary paradigmatic instrumental rationality. Rational choice is promoted as an exhaustive science of decision making, but only by smuggling in a characteristic confusion suggesting that everything of value to agents can be reflected in their appraisal of existential worth even though this is patently not the case in life viewed as a ‘fixed game.’ Without a critical and scrupulous pedagogy that carefully identifies as optional the assumptions necessary to operationalize strategic rationality, a new neoliberal understanding of capitalism will dominate the worldview of the student of game theory and inhabitant of neoliberal institutions.

When criticising game theory, you often get the rather uninformative and vacuous answer that we all have to remember that game theory — like mainstream neoclassical theory at large — is nothing but ‘as-if theory’ built on ‘as-if rationality.’ As Ariel Rubinstein has it, however, this only shows that “the phrase ‘as if’ is a way to avoid taking responsibility for the strong assumptions upon which economic models are founded” …
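
For readers who have never seen the textbook exercise the quotation alludes to, here is a minimal sketch of the Prisoner’s Dilemma (with illustrative payoffs of my own choosing). It shows what ‘instrumental rationality’ delivers: mutual defection is the only Nash equilibrium, even though both players would be better off cooperating.

```python
# Toy Prisoner's Dilemma (illustrative payoffs, not from the quoted text).
# Payoff convention: payoffs[(row_action, col_action)] = (row_payoff, col_payoff)
C, D = 0, 1  # cooperate, defect
payoffs = {
    (C, C): (3, 3),
    (C, D): (0, 5),
    (D, C): (5, 0),
    (D, D): (1, 1),
}

def best_response(options, other_action, player):
    """Return the action maximizing this player's payoff against a fixed opponent action."""
    def pay(a):
        profile = (a, other_action) if player == 0 else (other_action, a)
        return payoffs[profile][player]
    return max(options, key=pay)

# A profile is a (pure-strategy) Nash equilibrium if each action is a best
# response to the other player's action.
equilibria = [
    (a, b) for a in (C, D) for b in (C, D)
    if a == best_response((C, D), b, 0) and b == best_response((C, D), a, 1)
]
print(equilibria)  # [(1, 1)] -> mutual defection, although (C, C) pays both players more
```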

Missing the point — the quantitative ambitions of DSGE models

19 September, 2017 at 17:00 | Posted in Economics | Comments Off on Missing the point — the quantitative ambitions of DSGE models

A typical modern approach to writing a paper in DSGE macroeconomics is as follows:

o to establish “stylized facts” about the quantitative interrelationships of certain macroeconomic variables (e.g. moments of the data such as variances, autocorrelations, covariances, …) that have hitherto not been jointly explained;

o to write down a DSGE model of an economy subject to a defined set of shocks that aims to capture the described interrelationships; and

o to show that the model can “replicate” or “match” the chosen moments when it is fed with stochastic shocks generated by the assumed shock process …

However, the test imposed by matching DSGE models to the data is problematic in at least three respects:

First, the set of moments chosen to evaluate the model is largely arbitrary …

Second, for a given set of moments, there is no well-defined statistic to measure the goodness of fit of a DSGE model or to establish what constitutes an improvement in such a framework …

Third, the evaluation is complicated by the fact that, at some level, all economic models are rejected by the data … In addition, DSGE models frequently impose a number of restrictions that are in direct conflict with micro evidence. If a model has been rejected along some dimensions, then a statistic that measures the goodness-of-fit along other dimensions is meaningless …

Focusing on the quantitative fit of models also creates powerful incentives for researchers (i) to introduce elements that bear little resemblance to reality for the sake of achieving a better fit (ii) to introduce opaque elements that provide the researcher with free (or almost free) parameters and (iii) to introduce elements that improve the fit for the reported moments but deteriorate the fit along other unreported dimensions.

Albert Einstein observed that “not everything that counts can be counted, and not everything that can be counted counts.” DSGE models make it easy to offer a wealth of numerical results by following a well-defined set of methods (that requires one or two years of investment in graduate school, but is relatively straightforward to apply thereafter). There is a risk for researchers to focus too much on numerical predictions of questionable reliability and relevance that absorb a lot of time and effort rather than focusing on deeper conceptual questions that are of higher relevance for society.

Anton Korinek
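
To make the ‘moment matching’ step concrete, here is a minimal sketch of the procedure Korinek describes, using assumptions of my own (a toy AR(1) ‘model’ and two arbitrarily chosen target moments) rather than anything from an actual DSGE paper.

```python
# Toy illustration of "moment matching" (my own sketch, not Korinek's code).
# "Model": the output gap follows an AR(1) driven by i.i.d. shocks.  The chosen
# moments are the variance and first-order autocorrelation of the series.
import numpy as np

def simulate_ar1(rho, sigma, periods=2_000, seed=0):
    """Simulate y_t = rho * y_{t-1} + sigma * eps_t."""
    rng = np.random.default_rng(seed)
    y = np.zeros(periods)
    for t in range(1, periods):
        y[t] = rho * y[t - 1] + sigma * rng.normal()
    return y

def moments(series):
    """Variance and first-order autocorrelation -- the 'chosen' moments."""
    return np.var(series), np.corrcoef(series[:-1], series[1:])[0, 1]

# Pretend these were estimated from the data the paper wants to 'explain'.
target_var, target_ac = 2.5, 0.85

# Calibrate by grid search so that the simulated moments 'match' the targets.
grid = [(rho, sigma) for rho in np.linspace(0.5, 0.99, 20)
        for sigma in np.linspace(0.1, 2.0, 20)]
best = min(grid, key=lambda p: sum(
    (m - t) ** 2 for m, t in zip(moments(simulate_ar1(*p)), (target_var, target_ac))))

print("calibrated (rho, sigma):", best)
print("simulated moments:", moments(simulate_ar1(*best)))
# A close 'match' on these two moments says nothing about the model's fit
# along all the moments that were never chosen or reported.
```

Even in this toy setting, the choice of which moments to match and of what counts as a good fit is entirely in the researcher’s hands, which is exactly Korinek’s first and second objection.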

Great essay, showing that ‘rigorous’ and ‘precise’ DSGE models cannot be considered anything other than unsubstantiated conjectures as long as they are not supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence of any kind has been presented.

No matter how precise and rigorous the analysis is, and no matter how hard one tries to cast the argument in modern mathematical form, it does not push economic science forward one single millimetre if it does not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not say anything about real-world economies.

Proving things ‘rigorously’ in DSGE models is at most a starting point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.

Mainstream economists think there is a gain from the DSGE style of modelling in its capacity to offer some kind of structure around which to organise discussions. To me, that sounds more like a religious theoretical-methodological dogma, where one paradigm rules in divine hegemony. That’s not progress. That’s the death of economics as a science.

As Korinek argues, using DSGE models “creates a bias towards models that have a well-behaved ergodic steady state.” Since we know that most real-world processes do not follow an ergodic distribution, this is, to say the least, problematic. To understand real-world ‘non-routine’ decisions and unforeseeable changes in behaviour, stationary probability distributions are of no avail. In a world full of genuine uncertainty — where real historical time rules the roost — the probabilities that ruled the past are not those that will rule the future. Imposing invalid probabilistic assumptions on the data makes all DSGE models statistically misspecified.
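
As a quick illustration of why this matters, here is a toy sketch of my own (a simple random walk, one of the textbook examples of a non-ergodic process): the time average computed from the single history we happen to observe tells us little about the ensemble of histories that could have occurred.

```python
# Toy non-ergodicity check (my own sketch): for a random walk, the time average
# of a single realization and the cross-sectional (ensemble) average at a fixed
# date behave very differently, so statistics estimated from one observed
# history are a poor guide to the distribution over possible histories.
import numpy as np

rng = np.random.default_rng(42)
T, N = 2_000, 2_000                      # time steps, number of parallel "worlds"
steps = rng.choice([-1.0, 1.0], size=(N, T))
paths = steps.cumsum(axis=1)             # N independent random walks

time_avg_one_history = paths[0].mean()           # average over time, single world
ensemble_avg_last_date = paths[:, -1].mean()     # average over worlds, single date
spread_across_worlds = paths[:, -1].std()

print(f"time average of one history:    {time_avg_one_history: .2f}")
print(f"ensemble average at final date: {ensemble_avg_last_date: .2f}")
print(f"dispersion across histories:    {spread_across_worlds: .2f}")
# The ensemble mean stays near zero while any single history wanders far from it,
# and its time average depends entirely on which history we happened to live.
```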

Advocates of DSGE modelling want to have deductively automated answers to fundamental causal questions. But to apply ‘thin’ methods we have to have ‘thick’ background knowledge of what’s going on in the real world, and not in idealized models. Conclusions can only be as certain as their premises — and that also applies to the quest for causality and forecasting predictability in DSGE models.

If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized that have to match reality, not the other way around. The modelling convention used when constructing DSGE models makes it impossible to fully incorporate things that we know are of paramount importance for understanding modern economies — such as income and wealth inequality, asymmetrical power relations and information, liquidity preference, just to mention a few.

Given all these fundamental problems for the use of these models and their underlying methodology, it is beyond understanding how the DSGE approach has come to be the standard approach in ‘modern’ macroeconomics. DSGE models are based on assumptions profoundly at odds with what we know about real-world economies. That also makes them little more than overconfident story-telling devoid of real scientific value. Macroeconomics would do much better with more substantive diversity and plurality.

Dangers of ‘running with the mainstream pack’

18 September, 2017 at 14:43 | Posted in Economics | 3 Comments


An absolutely fabulous speech! And Soskice and Carlin’s textbook Macroeconomics: Institutions, Instability, and the Financial System — which Dullien mentions at the beginning of his speech — is really a very good example of the problems you run into if you want to be ‘pluralist’ within the mainstream pack.

Carlin and Soskice explicitly adapt a ‘New Keynesian’ framework, including price rigidities, and add a financial system to the usual neoclassical macroeconomic set-up. But although I find things like the latter amendment an improvement, it’s definitely more difficult to swallow their methodological stance, and especially their non-problematized acceptance of the need for macroeconomic microfoundations.

Some months ago, another sorta-kinda ‘New Keynesian’, Paul Krugman, argued on his blog that the problem with the academic profession is that some macroeconomists aren’t “bothered to actually figure out” how the New Keynesian model with its Euler conditions — “based on the assumption that people have perfect access to capital markets, so that they can borrow and lend at the same rate” — really works. According to Krugman, this shouldn’t be hard at all — “at least it shouldn’t be for anyone with a graduate training in economics.”

Carlin & Soskice seem to share Krugman’s attitude. From the first page of the book, they start to elaborate their preferred 3-equations ‘New Keynesian’ macromodel. And after twenty-two pages, they have already come to specifying the demand side with the help of the Permanent Income Hypothesis and its Euler equations.

But if people — not the representative agent — at least sometimes can’t help being off their labour supply curve — as in the real world — then how are these hordes of Euler equations that you find ad nauseam in ‘New Keynesian’ macromodels going to help us? Yours truly’s doubt regarding the ‘New Keynesian’ modellers’ obsession with Euler equations is basically that, as with so many other assumptions in ‘modern’ macroeconomics, the Euler equations don’t fit reality.

All empirical sciences use simplifying or unrealistic assumptions in their modelling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

If we cannot show that the mechanisms or causes we isolate and handle in our models are stable, in the sense that when we export them from our models to our target systems they do not change from one situation to another, then they only hold under ceteris paribus conditions and are, a fortiori, of limited value for our understanding, explanation and prediction of our real-world target system.

No matter how many convoluted refinements of concepts are made in the model, if the “successive approximations” do not result in models similar to reality in the appropriate respects (such as structure, isomorphism, etc.), the surrogate system becomes a substitute system that does not bridge to the world but rather misses its target.

From this methodological perspective, yours truly has to conclude that Carlin’s and Soskice’s microfounded macroeconomic model is a rather unimpressive attempt at legitimizing the use of fictitious idealizations — such as Euler equations — for reasons that have more to do with model tractability than with a genuine interest in understanding and explaining features of real economies.

Running with the mainstream pack is not a good idea if you want to develop realist and relevant economics.

Putting predictions to the test

17 September, 2017 at 11:58 | Posted in Economics | 1 Comment

It is the somewhat gratifying lesson of Philip Tetlock’s new book that people who make prediction their business — people who appear as experts on television, get quoted in newspaper articles, advise governments and businesses, and participate in punditry roundtables — are no better than the rest of us. When they’re wrong, they’re rarely held accountable, and they rarely admit it, either. They insist that they were just off on timing, or blindsided by an improbable event, or almost right, or wrong for the right reasons. They have the same repertoire of self-justifications that everyone has, and are no more inclined than anyone else to revise their beliefs about the way the world works, or ought to work, just because they made a mistake. No one is paying you for your gratuitous opinions about other people, but the experts are being paid, and Tetlock claims that the better known and more frequently quoted they are, the less reliable their guesses about the future are likely to be. The accuracy of an expert’s predictions actually has an inverse relationship to his or her self-confidence, renown, and, beyond a certain point, depth of knowledge. People who follow current events by reading the papers and newsmagazines regularly can guess what is likely to happen about as accurately as the specialists whom the papers quote. Our system of expertise is completely inside out: it rewards bad judgments over good ones.

The New Yorker

Mainstream neoclassical economists often maintain — usually referring to the instrumentalism of Milton Friedman — that it doesn’t matter if the assumptions of the models they use are realistic or not. What matters is whether the predictions are right or not. But, if so, then the only conclusion we can draw is — throw away the garbage! Because, oh dear, oh dear, how wrong they have been!

When Simon Potter a couple of years ago analyzed the predictions that the Federal Reserve Bank of New York had made on the development of real GDP and unemployment for the years 2007-2010, it turned out that the predictions were off by 5.9 and 4.4 percentage points respectively — which is equivalent to more than 6 million unemployed workers:

Economic forecasters never expect to predict precisely. One way of measuring the accuracy of their forecasts is against previous forecast errors. When judged by forecast error performance metrics from the macroeconomic quiescent period that many economists have labeled the Great Moderation, the New York Fed research staff forecasts, as well as most private sector forecasts for real activity before the Great Recession, look unusually far off the mark …

Using a similar approach to Reifschneider and Tulip but including forecast errors for 2007, one would have expected that 70 percent of the time the unemployment rate in the fourth quarter of 2009 should have been within 0.7 percentage point of a forecast made in April 2008. The actual forecast error was 4.4 percentage points, equivalent to an unexpected increase of over 6 million in the number of unemployed workers. Under the erroneous assumption that the 70 percent projection error band was based on a normal distribution, this would have been a 6 standard deviation error, a very unlikely occurrence indeed.
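
Just to put that ‘very unlikely occurrence’ into numbers, here is a back-of-the-envelope check of my own, made under the same (admittedly erroneous) normality assumption the quotation mentions.

```python
# Back-of-the-envelope check of the quote's "6 standard deviation" claim,
# assuming (wrongly, as the quote stresses) normally distributed forecast errors.
from scipy.stats import norm

band = 0.7            # 70%-coverage half-width for the unemployment forecast (pp)
z70 = norm.ppf(0.85)  # ~1.04: a symmetric 70% band spans +/- 1.04 standard deviations
sigma = band / z70    # implied standard deviation of the forecast error, ~0.67 pp

error = 4.4           # actual forecast error in percentage points
z = error / sigma     # ~6.5 standard deviations

print(f"implied sigma: {sigma:.2f} pp, an error of {error} pp is a {z:.1f}-sigma event")
print(f"probability of an error at least this large: {2 * norm.sf(z):.1e}")
# Under normality such an event is essentially impossible, which is the point:
# the normality assumption, not the world, is what fails.
```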

In other words — the “rigorous” and “precise” macroeconomic mathematical-statistical forecasting models were wrong. And the rest of us have to pay.

Potter is not the only one who lately has criticized the forecasting business. John Mingers comes to essentially the same conclusion when scrutinizing it from a somewhat more theoretical angle:

It is clearly the case that experienced modellers could easily come up with significantly different models based on the same set of data thus undermining claims to researcher-independent objectivity. This has been demonstrated empirically by Magnus and Morgan (1999) who conducted an experiment in which an apprentice had to try to replicate the analysis of a dataset that might have been carried out by three different experts (Leamer, Sims, and Hendry) following their published guidance. In all cases the results were different from each other, and different from that which would have been produced by the expert, thus demonstrating the importance of tacit knowledge in statistical analysis.

The empirical and theoretical evidence is clear. Predictions and forecasts are inherently difficult to make in a socio-economic domain where genuine uncertainty and unknown unknowns often rule the roost. The real processes that underlie the time series that economists use to make their predictions and forecasts do not conform with the assumptions made in the applied statistical and econometric models. Much less is a fortiori predictable than standardly — and uncritically — assumed. The forecasting models fail to a large extent because the kind of uncertainty that faces humans and societies makes the models, strictly speaking, inapplicable. The future is inherently unknowable — and using statistics, econometrics, decision theory or game theory does not in the least overcome this ontological fact. The economic future is not something that we normally can predict in advance. Better then to accept that as a rule “we simply do not know.”

In New York State, Section 899 of the Code of Criminal Procedure provides that persons “Pretending to Forecast the Future” shall be considered disorderly under subdivision 3, Section 901 of the Code and liable to a fine of $250 and/or six months in prison. Although the law does not apply to “ecclesiastical bodies acting in good faith and without fees,” I’m not sure where that leaves macroeconomic model-builders and other forecasters.

The accuracy of the predictions that experts make certainly seems to have an inverse relationship to their self-confidence. Being cocky and wrong is a lethal combination — and economists are often wrong and hardly known for being particularly modest people …

The growth of the Internet will slow drastically, as the flaw in “Metcalfe’s law”–which states that the number of potential connections in a network is proportional to the square of the number of participants–becomes apparent: most people have nothing to say to each other! By 2005 or so, it will become clear that the Internet’s impact on the economy has been no greater than the fax machine’s.

Paul Krugman

Sign Your Name

16 September, 2017 at 22:09 | Posted in Varia | Comments Off on Sign Your Name

 

Nights in White Satin

16 September, 2017 at 19:22 | Posted in Varia | Comments Off on Nights in White Satin

 

Economic growth and gender

16 September, 2017 at 17:59 | Posted in Economics | 2 Comments

The economic implications of gender discrimination are most serious. To deny women is to deprive a country of labor and talent, but — even worse — to undermine the drive to achievement of boys and men. One cannot rear young people in such wise that half of them think themselves superior by biology, without dulling ambition and devaluing accomplishment … To be sure, any society will have its achievers no matter what, if only because it has its own division of tasks and spoils. But it cannot compete with other societies that ask performance from the full pool of talent.

In general, the best clue to a nation’s growth and development potential is the status and role of women. This is the greatest handicap of Muslim Middle Eastern societies today, the flaw that most bars them from modernity.

Any economist who thinks that growth and development have little or nothing to do with cultural and religious imperatives ought to read Landes’ masterful survey of what makes some countries so rich and others so poor.

Wiener Kaffeehäuser (personal)

16 September, 2017 at 17:36 | Posted in Varia | Comments Off on Wiener Kaffeehäuser (personal)

Back in the 80’s yours truly had the pleasure of studying German at Universität Wien.  I’ve been back in Vienna a couple of times since then. A wonderful town full of history — and Kaffeehäuser!

Stiglitz and the full force of Sonnenschein-Mantel-Debreu

16 September, 2017 at 15:43 | Posted in Economics | 1 Comment

In his recent article on Where Modern Macroeconomics Went Wrong, Joseph Stiglitz acknowledges that his approach “and that of DSGE models begins with the same starting point: the competitive equilibrium model of Arrow and Debreu.”

This is probably also the reason why Stiglitz’ critique doesn’t go far enough.

It’s strange that mainstream macroeconomists still stick to a general equilibrium paradigm more than forty years after the Sonnenschein-Mantel-Debreu theorem — SMD — devastatingly showed that it is an absolute non-starter for building realist and relevant macroeconomics:

SMD theory means that assumptions guaranteeing good behavior at the microeconomic level do not carry over to the aggregate level or to qualitative features of the equilibrium …

Given how sweeping the changes wrought by SMD theory seem to be, it is understandable that some very broad statements about the character of general equilibrium theory were made. Fifteen years after General Competitive Analysis, Arrow (1986) stated that the hypothesis of rationality had few implications at the aggregate level. Kirman (1989) held that general equilibrium theory could not generate falsifiable propositions, given that almost any set of data seemed consistent with the theory …

S. Abu Turab Rizvi

New Classical-Real Business Cycles-DSGE-New Keynesian microfounded macromodels try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality. And — worse still — something that is not even amenable to the kind of general equilibrium analysis that they are thought to give a foundation for since SMD unequivocally showed that there did not exist any condition by which assumptions on individuals would guarantee either stability or uniqueness of the equilibrium solution.

Opting for cloned representative agents that are all identical is of course not a real solution to the fallacy of composition that SMD points to. Representative agent models are — as I have argued at length here — rather an evasion whereby issues of distribution, coordination, heterogeneity — everything that really defines macroeconomics — are swept under the rug.

Of course, most macroeconomists know that using a representative agent is a flagrantly illegitimate method of ignoring real aggregation issues. They keep on with their business, nevertheless, just because it significantly simplifies what they are doing. It is more than a little reminiscent of the drunkard who has lost his keys in some dark place and deliberately chooses to look for them under a neighbouring street light just because it is easier to see there.

General equilibrium is fundamental to economics on a more normative level as well. A story about Adam Smith, the invisible hand, and the merits of markets pervades introductory textbooks, classroom teaching, and contemporary political discourse. The intellectual foundation of this story rests on general equilibrium, not on the latest mathematical excursions. If the foundation of everyone’s favourite economics story is now known to be unsound — and according to some, uninteresting as well — then the profession owes the world a bit of an explanation.

Frank Ackerman

Almost a century and a half after Léon Walras founded general equilibrium theory, economists still have not been able to show that markets lead economies to equilibria. We do know that — under very restrictive assumptions — equilibria do exist, are unique and are Pareto-efficient. But — what good does that do? As long as we cannot show that there are convincing reasons to suppose there are forces which lead economies to equilibria — the value of general equilibrium theory is nil. As long as we cannot really demonstrate that there are forces operating — under reasonable, relevant and at least mildly realistic conditions — at moving markets to equilibria, there cannot really be any sustainable reason for anyone to pay any interest or attention to this theory.

A stability that can only be proved by assuming Santa Claus conditions is of no avail. Most people do not believe in Santa Claus anymore. And for good reasons — Santa Claus is for kids.

Continuing to model a world full of agents behaving as economists — ‘often wrong, but never uncertain’ — while still not being able to show that the system converges to equilibrium under reasonable assumptions (or simply assuming the problem away) is a gross misallocation of intellectual resources and time.

The full force of the Sonnenschein, Mantel, and Debreu (SMD) result is often not appreciated. Without stability or uniqueness, the intrinsic interest of economic analysis based on the general equilibrium model is extremely limited …

The usual way out of this problem is to assume a “representative agent,” and this obviously generates a unique equilibrium. However, the assumption of such an individual is open to familiar criticisms (Kirman 1992; Stoker 1995), and recourse to this creature raises one of the basic problems encountered on the route to the place where general equilibrium has found itself: the problem of aggregation. In fact, we know that, in general, there is no simple relation between individual and aggregate behavior, and to assume that behavior at one level can be assimilated to that at the other is simply erroneous …

The very fact that we observe, in reality, increasing amounts of resources being devoted to informational acquisition and processing implies that the standard general equilibrium model and the standard models of financial markets are failing to capture important aspects of reality.

Alan Kirman

9/11 & 11/9

14 September, 2017 at 20:03 | Posted in Politics & Society | 2 Comments

Skolverket and school segregation

14 September, 2017 at 16:12 | Posted in Education & School | Comments Off on Skolverket and school segregation

One of the Skolkommissionen’s overarching proposals is that the Education Act should require school organisers to work towards a balanced social composition of pupils. After noting that the trend towards increased socioeconomic differences between schools is very worrying, Skolverket nevertheless advises against the proposal. The reason given is that it is considered unclear what ‘a balanced social composition’ means. One may well think so, but then one must also wonder whether Skolverket’s own concern about increased socioeconomic differences does not rest on equally unclear grounds.

Because of this perceived vagueness, Skolverket considers the proposal to be legally uncertain and potentially discriminatory … It is hard to interpret this in any other way than that Skolverket dismisses the whole idea that counteracting social school segregation should be part of the school system’s mission …

Skolverket’s consultation response is truly astonishing. The agency opposes every concrete proposal to change the current system of pupil selection, and does so without offering any alternative measures of its own. Not even suggestions for how the commission’s measures might be modified are to be found. It moreover opposes giving school organisers any mandate at all to concern themselves with social school segregation. This stance would have been understandable if Skolverket did not regard school segregation as a problem, but that is not the case; the agency’s new director-general has even declared it a make-or-break issue. There is no sign of that in Skolverket’s argumentation.

Jonas Vlachos

Where modern macroeconomics went wrong

14 September, 2017 at 11:13 | Posted in Economics | 1 Comment

DSGE models seem to take it as a religious tenet that consumption should be explained by a model of a representative agent maximizing his utility over an infinite lifetime without borrowing constraints. Doing so is called micro-founding the model. But economics is a behavioral science. If Keynes was right that individuals saved a constant fraction of their income, an aggregate model based on that assumption is micro-founded. Of course, the economy consists of individuals who are different, but all of whom have a finite life and most of whom are credit constrained, and who do adjust their consumption behavior, if slowly, in response to changes in their economic environment. Thus, we also know that individuals do not save a constant fraction of their income, come what may. So both stories, the DSGE and the old-fashioned Keynesian, are simplifications. When they are incorporated into a simple macro-model, one is saying the economy acts as if… And then the question is, which provides a better description; a better set of prescriptions; and a better basis for future elaboration of the model. The answer is not obvious. The criticism of DSGE is thus not that it involves simplification: all models do. It is that it has made the wrong modelling choices, choosing complexity in areas where the core story of macroeconomic fluctuations could be told using simpler hypotheses, but simplifying in areas where much of the macroeconomic action takes place.

Joseph Stiglitz

Stiglitz is, of course, absolutely right.

DSGE models are worse than useless — and still, mainstream economists seem to be impressed by the ‘rigour’ brought to macroeconomics by New Classical-New Keynesian DSGE models and their rational expectations and microfoundations!

It is difficult to see why.

Take the rational expectations assumption. Rational expectations in the mainstream economists’ world imply that relevant distributions have to be time independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality, it is straining one’s beliefs to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons. In the world of the rational expectations hypothesis, we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that’s also the reason why allowing for cases where agents make ‘predictable errors’ in DSGE models doesn’t take us any closer to a relevant and realist depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations than childish roulette and urn analogies.

Or take the consumption model built into the DSGE models that Stiglitz criticises. There, people are basically portrayed as treating time as a dichotomous phenomenon – today and the future — when contemplating making decisions and acting. How much should one consume today and how much in the future? Facing an intertemporal budget constraint of the form

ct + cf/(1+r) = ft + yt + yf/(1+r),

where ct is consumption today, cf is consumption in the future, ft is holdings of financial assets today, yt is labour incomes today, yf is labour incomes in the future, and r is the real interest rate, and having a lifetime utility function of the form

U = u(ct) + au(cf),

where a is the time discounting parameter, the representative agent (consumer) maximizes his utility when

u´(ct) = a(1+r)u´(cf).

This expression – the Euler equation – implies that the representative agent (consumer) is indifferent between consuming one more unit today or instead consuming it tomorrow. Typically using a logarithmic function form – u(c) = log c – which gives u´(c) = 1/c, the Euler equation can be rewritten as

1/ct = a(1+r)(1/cf),

or

cf/ct = a(1+r).

This implies that, according to the neoclassical consumption model, changes in the (real) interest rate and the ratio between future and present consumption move in the same direction.
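
For readers who want to see the algebra at work, here is a minimal numerical sketch of the two-period problem above, using illustrative parameter values of my own. The maximizing consumer ends up exactly on the Euler equation, with cf/ct = a(1+r).

```python
# Numerical check of the two-period consumption problem above (illustrative
# parameter values).  With log utility the optimum should satisfy the Euler
# equation u'(ct) = a*(1+r)*u'(cf), i.e. cf/ct = a*(1+r).
import numpy as np
from scipy.optimize import minimize_scalar

a, r = 0.96, 0.03                 # time discount factor and real interest rate
ft, yt, yf = 10.0, 50.0, 40.0     # initial assets and labour incomes
wealth = ft + yt + yf / (1 + r)   # lifetime budget: ct + cf/(1+r) = wealth

def lifetime_utility(ct):
    cf = (wealth - ct) * (1 + r)           # future consumption from the budget
    return -(np.log(ct) + a * np.log(cf))  # minimize the negative of U

opt = minimize_scalar(lifetime_utility, bounds=(1e-6, wealth - 1e-6), method="bounded")
ct = opt.x
cf = (wealth - ct) * (1 + r)

print(f"ct = {ct:.3f}, cf = {cf:.3f}")
print(f"cf/ct = {cf/ct:.4f}  vs  a*(1+r) = {a*(1+r):.4f}")   # the two should coincide
```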

So far, so good. But how about the real world? Is neoclassical consumption theory, as described in this kind of model, in tune with the empirical facts? Hardly — the data and the models are as a rule inconsistent!

In the Euler equation, we only have one interest rate, equated to the money market rate as set by the central bank. The crux is that — given almost any specification of the utility function — the interest rate implied by the Euler equation and the money market rate are often found to be strongly negatively correlated in the empirical literature. The data on returns and aggregate consumption simply are inconsistent with the DSGE models.

Although yours truly shares a lot of Stiglitz’ critique of DSGE modelling — “the standard DSGE model provides a poor basis for policy, and tweaks to it are unlikely to be helpful” — it has to be said that his more general description of the history and state of modern macroeconomics is less convincing. Stiglitz notices that some of the greatest deficiencies in DSGE models “relates to the treatment of uncertainty,” but doesn’t really follow up on that core difference between Keynesian ‘genuine uncertainty’ economics and neoclassical ‘stochastic risk’ economics. DSGE models are only the latest outgrowth of neoclassical general equilibrium (Arrow-Debreu) economics. And that theory has never been, and will never be, a good starting point for constructing good macroeconomic theory and models. When the foundation of the house you build is weak, it will never be somewhere you want to live, no matter how many new — and in Stiglitz’ view better — varieties of ‘micro-foundations’ you add. As Lance Taylor writes (personal communication):

Aside from consistent accounting (Walras’s Law), the Arrow-Debreu model is useless for practical macroeconomic purposes. Its “agents” could never carry through their assigned optimization exercises, a feat beyond the capacity of a universal Turing machine, let alone feeble human brains. The Sonnenschein-Mantel-Debreu theorem, moreover, shows that microeconomic rationality assumptions have no significant macroeconomic implications …

The phalanx of models marshalled by Stiglitz uses “expectation” to mean the first moment of some objective probability distribution, shared by some or all players. In contrast, Keynes adopted a non-frequentist (and non-Bayesian) interpretation, whereby we cannot make a probabilistic assessment about whether some future event(s) will occur. “Fundamental uncertainty” is a kissing cousin of Donald Rumsfeld’s “unknown unknowns.” Moments computed from numbers of the past are irrelevant. Plugging them into algebraic machines generates no useful information.

As with all great bodies of thought, many strands of what Keynes has to say appear to be contradictory. Strict macro accounting on one hand and belief in the fundamental instability of capitalism make up just one example. Bastard Keynesian is just an attempt to sidestep the contradictions. Whether Keynes’s own ideas will help us disentangle the unknowns of the future is unclear. It is clear that Stiglitz’s enormous tool bag will not be very helpful.

Rethinking expectations

14 September, 2017 at 08:53 | Posted in Economics | Comments Off on Rethinking expectations

The tiny little problem that there is no hard empirical evidence that verifies rational expectations models doesn’t usually bother its protagonists too much. Rational expectations überpriest Thomas Sargent has defended the epistemological status of the rational expectations hypothesis arguing that since it “focuses on outcomes and does not pretend to have behavioral content,” it has proved to be “a powerful tool for making precise statements.”

Precise, yes, but relevant and realistic? I’ll be dipped!

In their attempted rescue operations, rational expectationists try to give the picture that only heterodox economists like yours truly are critical of the rational expectations hypothesis.

But, on this, they are, simply … eh … wrong.

Let’s listen to Nobel laureate Edmund Phelps — hardly a heterodox economist — and what he has to say (emphasis added):

Question: In a new volume with Roman Frydman, “Rethinking Expectations: The Way Forward for Macroeconomics,” you say the vast majority of macroeconomic models over the last four decades derailed your “microfoundations” approach. Can you explain what that is and how it differs from the approach that became widely accepted by the profession?

Answer: In the expectations-based framework that I put forward around 1968, we didn’t pretend we had a correct and complete understanding of how firms or employees formed expectations about prices or wages elsewhere. We turned to what we thought was a plausible and convenient hypothesis. For example, if the prices of a company’s competitors were last reported to be higher than in the past, it might be supposed that the company will expect their prices to be higher this time, too, but not that much. This is called “adaptive expectations:” You adapt your expectations to new observations but don’t throw out the past. If inflation went up last month, it might be supposed that inflation will again be high but not that high.

Q: So how did adaptive expectations morph into rational expectations?

A: The “scientists” from Chicago and MIT came along to say, we have a well-established theory of how prices and wages work. Before, we used a rule of thumb to explain or predict expectations: Such a rule is picked out of the air. They said, let’s be scientific. In their mind, the scientific way is to suppose price and wage setters form their expectations with every bit as much understanding of markets as the expert economist seeking to model, or predict, their behavior. The rational expectations approach is to suppose that the people in the market form their expectations in the very same way that the economist studying their behavior forms her expectations: on the basis of her theoretical model.

Q: And what’s the consequence of this putsch?

A: Craziness for one thing. You’re not supposed to ask what to do if one economist has one model of the market and another economist a different model. The people in the market cannot follow both economists at the same time. One, if not both, of the economists must be wrong. Another thing: It’s an important feature of capitalist economies that they permit speculation by people who have idiosyncratic views and an important feature of a modern capitalist economy that innovators conceive their new products and methods with little knowledge of whether the new things will be adopted — thus innovations. Speculators and innovators have to roll their own expectations. They can’t ring up the local professor to learn how. The professors should be ringing up the speculators and aspiring innovators. In short, expectations are causal variables in the sense that they are the drivers. They are not effects to be explained in terms of some trumped-up causes.

Q: So rather than live with variability, write a formula in stone!

A: What led to rational expectations was a fear of the uncertainty and, worse, the lack of understanding of how modern economies work. The rational expectationists wanted to bottle all that up and replace it with deterministic models of prices, wages, even share prices, so that the math looked like the math in rocket science. The rocket’s course can be modelled while a living modern economy’s course cannot be modelled to such an extreme. It yields up a formula for expectations that looks scientific because it has all our incomplete and not altogether correct understanding of how economies work inside of it, but it cannot have the incorrect and incomplete understanding of economies that the speculators and would-be innovators have.

Q: One of the issues I have with rational expectations is the assumption that we have perfect information, that there is no cost in acquiring that information. Yet the economics profession, including Federal Reserve policy makers, appears to have been hijacked by Robert Lucas.

A: You’re right that people are grossly uninformed, which is a far cry from what the rational expectations models suppose. Why are they misinformed? I think they don’t pay much attention to the vast information out there because they wouldn’t know what to do with it if they had it. The fundamental fallacy on which rational expectations models are based is that everyone knows how to process the information they receive according to the one and only right theory of the world. The problem is that we don’t have a “right” model that could be certified as such by the National Academy of Sciences. And as long as we operate in a modern economy, there can never be such a model.

Bloomberg

The rational expectations hypothesis presumes consistent behaviour, where expectations do not display any persistent errors. In the world of rational expectations we are always, on average, hitting the bull’s eye. In the more realistic, open systems view, there is always the possibility (danger) of making mistakes that may turn out to be systematic. It is because of this, presumably, that we put so much emphasis on learning in our modern knowledge societies.
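Phelps’s distinction between adaptive and rational expectations, and the ‘bull’s eye’ point above, can be made concrete in a few lines of code. The sketch below is my own illustration, not anything taken from the interview: the drifting inflation process and the parameters DRIFT, SIGMA, LAMBDA and T are made-up assumptions, chosen only to show the qualitative contrast: an adaptive forecaster lags systematically behind a trending variable, while a forecaster assumed to already know the true process is, by construction, right on average.

```python
import random

# A minimal sketch (illustrative assumptions only): inflation follows a simple
# upward drift, pi_t = pi_{t-1} + DRIFT + noise. DRIFT, SIGMA, LAMBDA and T
# are invented parameters, not estimates of anything.

random.seed(1)
DRIFT, SIGMA, LAMBDA, T = 0.2, 0.5, 0.4, 200

pi = 2.0            # last observed inflation
adaptive = 2.0      # adaptive forecast, updated towards past observations
errors_adaptive, errors_rational = [], []

for _ in range(T):
    # Both forecasts are formed before next period's inflation is observed.
    adaptive = adaptive + LAMBDA * (pi - adaptive)   # adaptive expectations
    rational = pi + DRIFT                            # assumed to know the true drift

    new_pi = pi + DRIFT + random.gauss(0.0, SIGMA)   # realized inflation

    errors_adaptive.append(new_pi - adaptive)
    errors_rational.append(new_pi - rational)
    pi = new_pi

mean = lambda xs: sum(xs) / len(xs)
print("mean adaptive forecast error :", round(mean(errors_adaptive), 3))   # persistently positive
print("mean 'rational' forecast error:", round(mean(errors_rational), 3))  # close to zero
```

The toy example only restates the hypothesis itself: under rational expectations, forecast errors are unsystematic by assumption, whereas in an open, changing world there is nothing that guarantees this.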

So, where does all this leave us? I think John Kay sums it up pretty well:

A scientific idea is not seminal because it influences the research agenda of PhD students. An important scientific advance yields conclusions that differ from those derived from other theories and establishes that these divergent conclusions are supported by observation. Yet as Prof Sargent disarmingly observed, “such empirical tests were rejecting too many good models” in the programme he had established with fellow Nobel laureates Bob Lucas and Ed Prescott. In their world, the validity of a theory is demonstrated if, after the event, and often with torturing of data and ad hoc adjustments that are usually called “imperfections”, it can be reconciled with already known facts – “calibrated”. Since almost everything can be “explained” in this way, the theory is indeed universal; no other approach is necessary, or even admissible …

I welcome their hatred

13 September, 2017 at 20:59 | Posted in Politics & Society | 3 Comments

 

Håll mitt hjärta

12 September, 2017 at 18:15 | Posted in Varia | Comments Off on Håll mitt hjärta

 

Whoa, so good. It even beats the original!

The Letter

12 September, 2017 at 17:57 | Posted in Varia | 1 Comment

 

The scar deep inside

12 September, 2017 at 15:07 | Posted in Varia | Comments Off on The scar deep inside

 

What makes economics a science?

12 September, 2017 at 13:04 | Posted in Economics | 2 Comments

Well, if we are to believe most mainstream economists, models are what make economics a science.

In a recent Journal of Economic Literature (1/2017) review of Dani Rodrik’s Economics Rules, renowned game theorist Ariel Rubinstein discusses Rodrik’s justifications for the view that “models make economics a science.” Although Rubinstein has some doubts about those justifications — models are not indispensable for telling good stories or clarifying things in general; logical consistency does not determine whether economic models are right or wrong; and being able to expand our set of ‘plausible explanations’ doesn’t make economics more of a science than good fiction does — he still largely subscribes to the scientific image of economics as a result of using formal models that help us achieve ‘clarity and consistency’.

There’s much in the review I like — Rubinstein shows a commendable scepticism on the prevailing excessive mathematization of economics, and he is much more in favour of a pluralist teaching of economics than most other mainstream economists — but on the core question, “the model is the message,” I beg to differ with the view put forward by both Rodrik and Rubinstein.

Economics is, more than any other social science, model-oriented. There are many reasons for this — the history of the discipline, ideals imported from the natural sciences (especially physics), the search for universality (explaining as much as possible with as little as possible), rigour, precision, etc.

Mainstream economists want to explain social phenomena, structures and patterns, based on the assumption that the agents are acting in an optimizing (rational) way to satisfy given, stable and well-defined goals.

The procedure is analytical. The whole is broken down into its constituent parts so as to be able to explain (reduce) the aggregate (macro) as the result of the interaction of its parts (micro).

Modern mainstream (neoclassical) economists ground their models on a set of core assumptions (CA) — basically describing the agents as ‘rational’ actors — and a set of auxiliary assumptions (AA). Together CA and AA make up what might be called the ‘ur-model’ (M) of all mainstream neoclassical economic models. Based on these two sets of assumptions, they try to explain and predict both individual (micro) and — most importantly — social phenomena (macro).

The core assumptions typically consist of:

CA1 Completeness — the rational actor is able to compare different alternatives and decide which one(s) he prefers.

CA2 Transitivity — if the actor prefers A to B, and B to C, he must also prefer A to C.

CA3 Non-satiation — more is preferred to less.

CA4 Maximizing expected utility — in choice situations under risk (calculable uncertainty) the actor maximizes expected utility.

CA5 Consistent efficiency equilibria — the actions of different individuals are consistent, and the interaction between them results in an equilibrium.

When describing the actors as rational in these models, the concept of rationality used is instrumental rationality – consistently choosing the preferred alternative, i.e. the one judged to have the best consequences for the actor, given his wishes/interests/goals, which are exogenously given in the model. How these preferences/wishes/interests/goals are formed is typically not considered to be within the realm of rationality, and a fortiori not part of economics proper.

The picture given by this set of core assumptions (rational choice) is of a rational agent with strong cognitive capacity who knows what alternatives he is facing, evaluates them carefully, calculates the consequences and — given his preferences — chooses the one he believes has the best consequences.

Weighing the different alternatives against each other, the actor makes a consistent, optimizing choice (typically described as maximizing some kind of utility function) and acts accordingly.
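To make this rational-choice picture concrete, here is a minimal sketch of expected-utility maximization over a finite menu of lotteries. It is entirely my own illustration: the alternatives, probabilities and the logarithmic utility function are hypothetical, not taken from any actual model. It simply shows what ‘weighing the alternatives and maximizing some kind of utility function’ amounts to when the core assumptions are taken at face value.

```python
from math import log

# A minimal sketch (hypothetical data): an instrumentally rational agent
# ranks lotteries by expected utility and picks the best one.

# Each alternative is a lottery: a list of (probability, monetary outcome) pairs.
alternatives = {
    "safe":   [(1.0, 50.0)],
    "risky":  [(0.5, 120.0), (0.5, 10.0)],
    "gamble": [(0.1, 400.0), (0.9, 5.0)],
}

def utility(x: float) -> float:
    # A concave, strictly increasing utility of money (in line with CA3, non-satiation).
    return log(1.0 + x)

def expected_utility(lottery) -> float:
    # CA4: under risk, alternatives are ranked by expected utility.
    return sum(p * utility(x) for p, x in lottery)

# CA1/CA2: ranking by a single real number makes the comparison complete and
# transitive by construction; the 'rational' choice is simply the argmax.
choice = max(alternatives, key=lambda name: expected_utility(alternatives[name]))

for name, lottery in alternatives.items():
    print(f"{name:>6}: expected utility = {expected_utility(lottery):.3f}")
print("chosen:", choice)
```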

Besides the core assumptions (CA), the model also typically has a set of auxiliary assumptions (AA) spatio-temporally specifying the kind of social interaction between ‘rational actors’ that takes place in the model. These assumptions can be seen as giving answers to questions such as

AA1 who are the actors and where and when do they act

AA2 which specific goals do they have

AA3 what are their interests

AA4 what kind of expectations do they have

AA5 what are their feasible actions

AA6 what kind of agreements (contracts) can they enter into

AA7 how much and what kind of information do they possess

AA8 how do the actions of the different individuals/agents interact with each other

So, the ur-model of all economic models basically consists of a general specification of what (axiomatically) constitutes optimizing rational agents and a more specific description of the kind of situations in which these rational actors act (making AA serve as a kind of specification/restriction of the intended domain of application for CA and its deductively derived theorems). The list of assumptions can never be complete, since there will always be unspecified background assumptions and some (often silent) omissions (like closure, transaction costs, etc., regularly based on negligibility and applicability considerations). The hope, however, is that the ‘thin’ list of assumptions will be sufficient to explain and predict ‘thick’ phenomena in the real, complex world.

In some (textbook) model depictions, we are essentially given the following structure,

A1, A2, … An
———————-
Theorem,

where an undifferentiated set of assumptions is used to infer a theorem.

This is, however, too vague and imprecise to be helpful, and does not give a true picture of the usual mainstream modelling strategy, where there’s a differentiation between a set of law-like hypotheses (CA) and a set of auxiliary assumptions (AA), giving the more adequate structure

CA1, CA2, … CAn & AA1, AA2, … AAn
———————————————–
Theorem

or,

CA1, CA2, … CAn
———————-
(AA1, AA2, … AAn) → Theorem,

more clearly underlining the function of AA as a set of (empirical, spatio-temporal) restrictions on the applicability of the deduced theorems.
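In more compact notation (a formalization of my own, offered only as a sketch), the two schemes are related by the ordinary deduction theorem:

```latex
% My own restatement of the two inference schemes above, not taken from the text.
\begin{align*}
  \text{(i)}  &\quad CA_1,\dots,CA_n,\ AA_1,\dots,AA_m \;\vdash\; T \\
  \text{(ii)} &\quad CA_1,\dots,CA_n \;\vdash\; (AA_1 \wedge \dots \wedge AA_m) \rightarrow T
\end{align*}
% (i) and (ii) are logically equivalent; (ii) merely makes explicit that the
% auxiliary assumptions AA function as an antecedent restricting where the
% theorem T is claimed to apply.
```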

The point is that the specification of AA restricts the range of applicability of the deduced theorem. In the extreme cases we get

CA1, CA2, … CAn
———————
Theorem,

where the deduced theorems are analytical entities with universal and totally unrestricted applicability, or

AA1, AA2, … AAn
———————-
Theorem,

where the deduced theorem is transformed into an untestable tautological thought-experiment without any empirical commitment whatsoever beyond telling a coherent fictitious as-if story.

Not clearly differentiating between CA and AA means that we can’t make this all-important interpretative distinction. It opens the door to unwarrantedly ‘saving’ or ‘immunizing’ models from almost any kind of critique, by simply equivocating between interpreting models as empirically empty, purely deductive-axiomatic analytical systems and interpreting them as models with explicit empirical aspirations. Flexibility is usually deemed a virtue, but in this methodological context it is more a sign of trouble than of real strength. Models that are compatible with everything, or come with unspecified domains of application, are worthless from a scientific point of view.

Economics — in contradistinction to logic and mathematics — ought to be an empirical science, and empirical testing of ‘axioms’ ought to be self-evidently relevant for such a discipline. For although the mainstream economist himself (implicitly) claims that his axioms are universally accepted as true and in no need of proof, that is in no way a justified reason for the rest of us to accept the claim simpliciter.

When applying deductivist thinking to economics, mainstream (neoclassical) economists usually set up ‘as if’ models based on the logic of idealization and a set of tight axiomatic assumptions from which consistent and precise inferences are made. The beauty of this procedure is, of course, that if the axiomatic premises are true, the conclusions necessarily follow. But although the procedure is a marvellous tool in mathematics and axiomatic-deductivist systems, it is a poor guide for real-world systems. As Hans Albert has it on the neoclassical style of thought:

Science progresses through the gradual elimination of errors from a large offering of rivalling ideas, the truth of which no one can know from the outset. The question of which of the many theoretical schemes will finally prove to be especially productive and will be maintained after empirical investigation cannot be decided a priori. Yet to be useful at all, it is necessary that they are initially formulated so as to be subject to the risk of being revealed as errors. Thus one cannot attempt to preserve them from failure at every price. A theory is scientifically relevant first of all because of its possible explanatory power, its performance, which is coupled with its informational content …

Clearly, it is possible to interpret the ‘presuppositions’ of a theoretical system … not as hypotheses, but simply as limitations to the area of application of the system in question. Since a relationship to reality is usually ensured by the language used in economic statements, in this case the impression is generated that a content-laden statement about reality is being made, although the system is fully immunized and thus without content. In my view that is often a source of self-deception in pure economic thought …

Most mainstream economic models are abstract, unrealistic and present mostly non-testable hypotheses. How then are they supposed to tell us anything about the world we live in?

Confronted with the massive empirical failures of their models and theories, mainstream economists often retreat into viewing them as some kind of ‘conceptual exploration,’ giving up any hope whatsoever of relating them to the real world. Instead of trying to bridge the gap between models and the world, one decides to look the other way.

To me, this kind of scientific defeatism is equivalent to surrendering our search for understanding the world we live in. It can’t be enough to prove or deduce things in a model world. If theories and models do not directly or indirectly tell us anything about the world we live in – then why should we waste any of our precious time on them?

The way axioms and theorems are formulated in mainstream (neoclassical) economics standardly leaves their specification almost without any restrictions whatsoever, safely making every imaginable piece of evidence compatible with the all-embracing ‘theory’ — and a theory without informational content never risks being empirically tested and falsified. Used in mainstream economics’ ‘thought experimental’ activities, it may, of course, be very ‘handy’, but it is totally void of any empirical value.

Mainstream economic models are nothing but broken-pieces models. That kind of model can’t make economics a science.

Küssen kann man nicht alleine

9 September, 2017 at 13:56 | Posted in Varia | 4 Comments

 

The essence of neoliberalism

9 September, 2017 at 12:35 | Posted in Politics & Society | Comments Off on The essence of neoliberalism

Economists may not necessarily share the economic and social interests of the true believers and may have a variety of individual psychic states regarding the economic and social effects of the utopia which they cloak with mathematical reason. Nevertheless, they have enough specific interests in the field of economic science to contribute decisively to the production and reproduction of belief in the neoliberal utopia. Separated from the realities of the economic and social world by their existence and above all by their intellectual formation, which is most frequently purely abstract, bookish, and theoretical, they are particularly inclined to confuse the things of logic with the logic of things.

These economists trust models that they almost never have occasion to submit to the test of experimental verification and are led to look down upon the results of the other historical sciences, in which they do not recognise the purity and crystalline transparency of their mathematical games, whose true necessity and profound complexity they are often incapable of understanding. They participate and collaborate in a formidable economic and social change. Even if some of its consequences horrify them (they can join the socialist party and give learned counsel to its representatives in the power structure), it cannot displease them because, at the risk of a few failures, imputable to what they sometimes call “speculative bubbles”, it tends to give reality to the ultra-logical utopia (ultra-logical like certain forms of insanity) to which they consecrate their lives.

Pierre Bourdieu
