Public debt — an economic necessity

26 September, 2017 at 09:06 | Posted in Economics | 2 Comments

We are not going to get out of the economic doldrums as long as we continue to be obsessed with the unreasoned ideological goal of reducing the so-called deficit. The “deficit” is not an economic sin but an economic necessity. […]

The administration is trying to bring the Titanic into harbor with a canoe paddle, while Congress is arguing over whether to use an oar or a paddle, and the Perots and budget balancers seem eager to lash the helm hard-a-starboard towards the iceberg. Some of the argument seems to be over which foot is the better one to shoot ourselves in. We have the resources in terms of idle manpower and idle plants to do so much, while the preachers of austerity, most of whom are in little danger of themselves suffering any serious consequences, keep telling us to tighten our belts and refrain from using the resources that lie idle all around us.

Alexander Hamilton once wrote “A national debt, if it be not excessive, would be for us a national treasure.” William Jennings Bryan used to declaim, “You shall not crucify mankind upon a cross of gold.” Today’s cross is not made of gold, but is concocted of a web of obfuscatory financial rectitude from which human values have been expunged.

William Vickrey


Seven sins of economics

23 September, 2017 at 11:44 | Posted in Economics | 4 Comments

There has always been some level of scepticism about the ability of economists to offer meaningful predictions and prognoses about economic and social phenomena. That scepticism has heightened in the wake of the global financial crisis, leading to what is arguably the biggest credibility crisis the discipline has faced in the modern era.

Some of the criticisms against economists are misdirected. But the major thrust of the criticisms does have bite.

There are seven key failings, or the ‘seven sins’, as I am going to call them, that have led economists to their current predicament. These include sins of commission as well as sins of omission.

Sin 1: Alice in Wonderland assumptions

The problem with economists is not that they make assumptions. After all, any theory or model will have to rely on simplifying assumptions … But when critical assumptions are made just to circumvent well-identified complexities in the quest to build elegant theories, such theories will simply end up being elegant fantasies.

Sin 2: Abuse of modelling

What compounds the sin of wild assumptions is the sin of careless modelling, and then selling that model as if it were a true depiction of an economy or society …

Sin 3: Intellectual capture

Several post-crisis assessments of the economy and of economics have pointed to intellectual capture as a key reason the profession, as a whole, failed to sound alarm bells about problems in the global economy, and failed to highlight flaws in the modern economic architecture …

Sin 4: The science obsession

The excessive obsession in the discipline with identifying itself as a science has been costly. This has led to a dangerous quest for standardization in the profession, leading many economists to mistake a model of the economy for ‘the model’ of the economy …

The science obsession has diminished the diversity of the profession, and arguably allowed complacency to take root in the run-up to the global financial crisis …

Sin 5: Perpetuating the myth of ‘the textbook’ and Econ 101

The quest for standardization has also led to an astonishing level of uniformity in the manner in which economists are trained, and in the manner in which economists train others. Central to this exercise are textbooks that help teach the lessons of ‘Econ 101’ — lessons as disconnected from reality as they are from the frontiers of economic research …

Sin 6: Ignoring society

What makes Econ 101 and a lot of mainstream economics particularly limiting is their neglect of the role of culture and social norms in determining economic outcomes, even though classical economists such as Adam Smith and Karl Marx took care to emphasize how social norms and social interactions shape economic outcomes …

Economists typically don’t engage with other social sciences, even though insights from those disciplines have a direct bearing on the subjects of economic enquiry …

Sin 7: Ignoring history

One way in which economists could have compensated for the lack of engagement with other social sciences is by studying economic history. After all, studying economic history carefully can help us understand the social and institutional contexts in which particular economic models worked, or did not work …

But economic history has been relegated to the margins over the past several years, and many graduate students still remain unacquainted with the subject.

Pramit Bhattacharya

Game theory and the shaping of neoliberal capitalism

21 September, 2017 at 17:00 | Posted in Economics | 9 Comments

Neoliberal subjectivity arises from the intricate pedagogy of game theory that comes to the fore in the Prisoner’s Dilemma game and is interchangeable with contemporary paradigmatic instrumental rationality. Rational choice is promoted as an exhaustive science of decision making, but only by smuggling in a characteristic confusion suggesting that everything of value to agents can be reflected in their appraisal of existential worth even though this is patently not the case in life viewed as a ‘fixed game.’ Without a critical and scrupulous pedagogy that carefully identifies as optional the assumptions necessary to operationalize strategic rationality, a new neoliberal understanding of capitalism will dominate the worldview of the student of game theory and inhabitant of neoliberal institutions.

When criticising game theory you often get the rather uninformative and vacuous answer that we all have to remember that game theory — like mainstream neoclassical theory at large — is nothing but ‘as-if-theory’ built on ‘as-if-rationality.’ As Ariel Rubinstein has it, however, this only shows that “the phrase ‘as if’ is a way to avoid taking responsibility for the strong assumptions upon which economic models are founded” …
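For readers who have never met it, the Prisoner’s Dilemma that carries all this pedagogical weight fits in a few lines of code. Here is a minimal sketch (the payoff numbers are the usual textbook choice, not taken from the quoted book):

```python
# The Prisoner's Dilemma with a standard payoff matrix (numbers are the
# usual textbook choice, not from the quoted book). Each player's payoff
# is payoff[own_move][other_move]; moves: 0 = cooperate, 1 = defect.
payoff = [[3, 0],   # I cooperate: 3 if you cooperate, 0 if you defect
          [5, 1]]   # I defect:    5 if you cooperate, 1 if you defect

for other in (0, 1):
    best = max((0, 1), key=lambda own: payoff[own][other])
    print(f"if the other plays {other}, my best reply is {best}")
# Defection (1) is the best reply either way -- a dominant strategy --
# yet mutual defection (1, 1) pays each player 1 instead of 3.
```

Defection strictly dominates, so ‘rational’ play lands both players in the inferior outcome. That gap between individual rationality and collective outcome is precisely what carries the ideological weight the quote describes.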

Missing the point — the quantitative ambitions of DSGE models

19 September, 2017 at 17:00 | Posted in Economics | Comments Off on Missing the point — the quantitative ambitions of DSGE models

A typical modern approach to writing a paper in DSGE macroeconomics is as follows:

- to establish “stylized facts” about the quantitative interrelationships of certain macroeconomic variables (e.g. moments of the data such as variances, autocorrelations, covariances, …) that have hitherto not been jointly explained;

- to write down a DSGE model of an economy subject to a defined set of shocks that aims to capture the described interrelationships; and

- to show that the model can “replicate” or “match” the chosen moments when it is fed with stochastic shocks generated by the assumed shock process …

However, the test imposed by matching DSGE models to the data is problematic in at least three respects:

First, the set of moments chosen to evaluate the model is largely arbitrary …

Second, for a given set of moments, there is no well-defined statistic to measure the goodness of fit of a DSGE model or to establish what constitutes an improvement in such a framework …

Third, the evaluation is complicated by the fact that, at some level, all economic models are rejected by the data … In addition, DSGE models frequently impose a number of restrictions that are in direct conflict with micro evidence. If a model has been rejected along some dimensions, then a statistic that measures the goodness-of-fit along other dimensions is meaningless …

Focusing on the quantitative fit of models also creates powerful incentives for researchers (i) to introduce elements that bear little resemblance to reality for the sake of achieving a better fit, (ii) to introduce opaque elements that provide the researcher with free (or almost free) parameters, and (iii) to introduce elements that improve the fit for the reported moments but deteriorate the fit along other unreported dimensions.

Albert Einstein observed that “not everything that counts can be counted, and not everything that can be counted counts.” DSGE models make it easy to offer a wealth of numerical results by following a well-defined set of methods (that requires one or two years of investment in graduate school, but is relatively straightforward to apply thereafter). There is a risk for researchers to focus too much on numerical predictions of questionable reliability and relevance that absorb a lot of time and effort rather than focusing on deeper conceptual questions that are of higher relevance for society.

Anton Korinek
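To make the moment-matching routine Korinek describes concrete, here is a minimal sketch (all numbers are invented for illustration): simulate the model economy under an assumed AR(1) shock process, compute the chosen moments, and set them beside the “stylized facts”:

```python
# A stripped-down version of the DSGE "match the moments" exercise Korinek
# describes: feed a model with AR(1) shocks and compare simulated moments
# (variance, autocorrelation) with target moments. All numbers are made up.
import numpy as np

rng = np.random.default_rng(0)
rho, sigma = 0.9, 0.01          # assumed shock persistence and volatility
T, burn = 100_000, 1_000

# "model": in this stripped-down case, output is just the AR(1) shock itself
eps = rng.normal(0.0, sigma, T + burn)
y = np.zeros(T + burn)
for t in range(1, T + burn):
    y[t] = rho * y[t - 1] + eps[t]
y = y[burn:]

model_moments = {"variance": y.var(),
                 "autocorr(1)": np.corrcoef(y[:-1], y[1:])[0, 1]}
data_moments = {"variance": 5.2e-4, "autocorr(1)": 0.88}  # hypothetical "facts"

for k in model_moments:
    print(f"{k:12s} model: {model_moments[k]:.4g}   data: {data_moments[k]:.4g}")
```

In an actual paper the “model” block would be a solved DSGE economy rather than a bare AR(1), but the logic of the test — and all three of Korinek’s objections to it — are unchanged.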

Great essay, showing that ‘rigorous’ and ‘precise’ DSGE models cannot be considered anything other than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence of any kind has been presented.

No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, these models do not push economic science forward one single millimetre if they do not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not say anything about real-world economies.

Proving things ‘rigorously’ in DSGE models is at most a starting point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.

Mainstream economists think there is a gain from the DSGE style of modelling in its capacity to offer some kind of structure around which to organise discussions. To me, that sounds more like a religious theoretical-methodological dogma, where one paradigm rules in divine hegemony. That’s not progress. That’s the death of economics as a science.

As Korinek argues, using DSGE models “creates a bias towards models that have a well-behaved ergodic steady state.” Since we know that most real-world processes do not follow an ergodic distribution, this is, to say the least, problematic. To understand real-world ‘non-routine’ decisions and unforeseeable changes in behaviour, stationary probability distributions are of no avail. In a world full of genuine uncertainty — where real historical time rules the roost — the probabilities that ruled the past are not those that will rule the future. Imposing invalid probabilistic assumptions on the data makes all DSGE models statistically misspecified.
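A toy illustration of what non-ergodicity means in practice (my sketch, not from Korinek’s essay): in the multiplicative gamble below, the ensemble-average wealth grows every period, yet almost every individual trajectory through time decays, so the cross-sectional probabilities tell you nothing about any single history:

```python
# A multiplicative gamble (standard toy example, not from the post): wealth
# is multiplied by 1.5 on heads, 0.6 on tails. The ensemble mean grows by
# 5% per toss (0.5*1.5 + 0.5*0.6 = 1.05), but the time-average growth
# factor is sqrt(1.5*0.6) ~ 0.95 < 1, so a typical single history decays.
import numpy as np

rng = np.random.default_rng(42)
n_agents, n_periods = 100_000, 100
wealth = np.ones(n_agents)

for _ in range(n_periods):
    heads = rng.random(n_agents) < 0.5
    wealth *= np.where(heads, 1.5, 0.6)

print("ensemble-average wealth:", wealth.mean())      # grows: ~1.05**100, ~131
print("median (typical) wealth:", np.median(wealth))  # collapses towards zero
```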

Advocates of DSGE modelling want to have deductively automated answers to fundamental causal questions. But to apply ‘thin’ methods we have to have ‘thick’ background knowledge of what’s going on in the real world, and not in idealized models. Conclusions can only be as certain as their premises — and that also applies to the quest for causality and forecasting predictability in DSGE models.

If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized that have to match reality, not the other way around. The modelling convention used when constructing DSGE models makes it impossible to fully incorporate things that we know are of paramount importance for understanding modern economies — such as income and wealth inequality, asymmetrical power relations and information, liquidity preference, just to mention a few.

Given all these fundamental problems for the use of these models and their underlying methodology, it is beyond understanding how the DSGE approach has come to be the standard approach in ‘modern’ macroeconomics. DSGE models are based on assumptions profoundly at odds with what we know about real-world economies. That also makes them little more than overconfident story-telling devoid of real scientific value. Macroeconomics would do much better with more substantive diversity and plurality.

Dangers of ‘running with the mainstream pack’

18 September, 2017 at 14:43 | Posted in Economics | 3 Comments


An absolutely fabulous speech — and Carlin and Soskice’s textbook Macroeconomics: Institutions, Instability, and the Financial System, which Dullien mentions at the beginning of the speech, is really a very good example of the problems you run into if you want to be ‘pluralist’ within the mainstream pack.

Carlin and Soskice explicitly adopt a ‘New Keynesian’ framework, including price rigidities and adding a financial system to the usual neoclassical macroeconomic set-up. But although I find things like the latter amendment an improvement, it’s definitely more difficult to swallow their methodological stance, and especially their non-problematized acceptance of the need for macroeconomic microfoundations.

Some months ago, another sorta-kinda ‘New Keynesian’, Paul Krugman, argued on his blog that the problem with the academic profession is that some macroeconomists aren’t “bothered to actually figure out” how the New Keynesian model with its Euler conditions — “based on the assumption that people have perfect access to capital markets, so that they can borrow and lend at the same rate” — really works. According to Krugman, this shouldn’t be hard at all — “at least it shouldn’t be for anyone with a graduate training in economics.”

Carlin & Soskice seem to share Krugman’s attitude. From the first page of the book, they start to elaborate their preferred 3-equations ‘New Keynesian’ macromodel. And after twenty-two pages, they have already come to specifying the demand side with the help of the Permanent Income Hypothesis and its Euler equations.

But if people — not the representative agent — at least sometimes can’t help being off their labour supply curve — as in the real world — then how are the hordes of Euler equations that you find ad nauseam in these ‘New Keynesian’ macromodels going to help us? Yours truly’s doubt regarding the ‘New Keynesian’ modellers’ obsession with Euler equations is basically that, as with so many other assumptions in ‘modern’ macroeconomics, the Euler equations don’t fit reality.

All empirical sciences use simplifying or unrealistic assumptions in their modelling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

If we cannot show that the mechanisms or causes we isolate and handle in our models are stable, in the sense that when we export them from our models to our target systems they do not change from one situation to another, then they only hold under ceteris paribus conditions, and a fortiori are of limited value for our understanding, explanation and prediction of our real-world target systems.

No matter how many convoluted refinements of concepts are made in the model, if the “successive approximations” do not result in models similar to reality in the appropriate respects (such as structure, isomorphism, etc.), the surrogate system becomes a substitute system that does not bridge to the world but rather misses its target.

From this methodological perspective, yours truly has to conclude that Carlin and Soskice’s microfounded macroeconomic model is a rather unimpressive attempt at legitimizing the use of fictitious idealizations — such as Euler equations — for reasons that have more to do with model tractability than with a genuine interest in understanding and explaining features of real economies.

Running with the mainstream pack is not a good idea if you want to develop realist and relevant economics.

Putting predictions to the test

17 September, 2017 at 11:58 | Posted in Economics | 1 Comment

It is the somewhat gratifying lesson of Philip Tetlock’s new book that people who make prediction their business — people who appear as experts on television, get quoted in newspaper articles, advise governments and businesses, and participate in punditry roundtables — are no better than the rest of us. When they’re wrong, they’re rarely held accountable, and they rarely admit it, either. They insist that they were just off on timing, or blindsided by an improbable event, or almost right, or wrong for the right reasons. They have the same repertoire of self-justifications that everyone has, and are no more inclined than anyone else to revise their beliefs about the way the world works, or ought to work, just because they made a mistake. No one is paying you for your gratuitous opinions about other people, but the experts are being paid, and Tetlock claims that the better known and more frequently quoted they are, the less reliable their guesses about the future are likely to be. The accuracy of an expert’s predictions actually has an inverse relationship to his or her self-confidence, renown, and, beyond a certain point, depth of knowledge. People who follow current events by reading the papers and newsmagazines regularly can guess what is likely to happen about as accurately as the specialists whom the papers quote. Our system of expertise is completely inside out: it rewards bad judgments over good ones.

The New Yorker

Mainstream neoclassical economists often maintain — usually referring to the instrumentalist methodology of Milton Friedman — that it doesn’t matter whether the assumptions of the models they use are realistic. What matters is whether the predictions are right or not. But, if so, then the only conclusion we can draw is: throw away the garbage! Because, oh dear, oh dear, how wrong they have been!

When Simon Potter a couple of years ago analyzed the predictions that the Federal Reserve Bank of New York made about the development of real GDP and unemployment for the years 2007–2010, it turned out that the predictions were off by 5.9 and 4.4 percentage points respectively — the latter equivalent to more than 6 million unemployed workers:

Economic forecasters never expect to predict precisely. One way of measuring the accuracy of their forecasts is against previous forecast errors. When judged by forecast error performance metrics from the macroeconomic quiescent period that many economists have labeled the Great Moderation, the New York Fed research staff forecasts, as well as most private sector forecasts for real activity before the Great Recession, look unusually far off the mark …

Using a similar approach to Reifschneider and Tulip but including forecast errors for 2007, one would have expected that 70 percent of the time the unemployment rate in the fourth quarter of 2009 should have been within 0.7 percentage point of a forecast made in April 2008. The actual forecast error was 4.4 percentage points, equivalent to an unexpected increase of over 6 million in the number of unemployed workers. Under the erroneous assumption that the 70 percent projection error band was based on a normal distribution, this would have been a 6 standard deviation error, a very unlikely occurrence indeed.

In other words — the “rigorous” and “precise” macroeconomic mathematical-statistical forecasting models were wrong. And the rest of us have to pay.
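The ‘6 standard deviation’ arithmetic in the Potter quote is easy to reproduce. A minimal sketch (the band half-width and the realized error are taken from the quote; the normality assumption is, as Potter stresses, erroneous):

```python
# Reproducing the "6 standard deviation" arithmetic from the Potter quote,
# under the (erroneous, as he stresses) assumption that the 70 percent
# projection error band came from a normal distribution.
from scipy.stats import norm

band_halfwidth = 0.7      # 70% band: forecast +/- 0.7 percentage points
actual_error = 4.4        # realized forecast error, percentage points

z70 = norm.ppf(0.85)                  # central 70% of a normal: +/- 1.04 sigma
sigma = band_halfwidth / z70          # implied standard deviation, ~0.68 pp

print(f"error in standard deviations: {actual_error / sigma:.1f}")        # ~6.5
print(f"probability of an error that large: {norm.sf(actual_error / sigma):.1e}")
```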

Potter is not the only one who lately has criticized the forecasting business. John Mingers comes to essentially the same conclusion when scrutinizing it from a somewhat more theoretical angle:

It is clearly the case that experienced modellers could easily come up with significantly different models based on the same set of data thus undermining claims to researcher-independent objectivity. This has been demonstrated empirically by Magnus and Morgan (1999) who conducted an experiment in which an apprentice had to try to replicate the analysis of a dataset that might have been carried out by three different experts (Leamer, Sims, and Hendry) following their published guidance. In all cases the results were different from each other, and different from that which would have been produced by the expert, thus demonstrating the importance of tacit knowledge in statistical analysis.

The empirical and theoretical evidence is clear. Predictions and forecasts are inherently difficult to make in a socio-economic domain where genuine uncertainty and unknown unknowns often rule the roost. The real processes that underlie the time series that economists use to make their predictions and forecasts do not conform with the assumptions made in the applied statistical and econometric models. A fortiori, much less is predictable than is standardly — and uncritically — assumed. The forecasting models fail to a large extent because the kind of uncertainty that faces humans and societies makes the models, strictly speaking, inapplicable. The future is inherently unknowable — and using statistics, econometrics, decision theory or game theory does not in the least overcome this ontological fact. The economic future is not something that we normally can predict in advance. Better, then, to accept that as a rule “we simply do not know.”

In New York State, Section 899 of the Code of Criminal Procedure provides that persons “Pretending to Forecast the Future” shall be considered disorderly under subdivision 3, Section 901 of the Code and liable to a fine of $250 and/or six months in prison. Although the law does not apply to “ecclesiastical bodies acting in good faith and without fees,” I’m not sure where that leaves macroeconomic model-builders and other forecasters.

The accuracy of the predictions that experts make certainly seems to have an inverse relationship to their self-confidence. Being cocky and wrong is a lethal combination — and economists are often wrong and hardly known for being particularly modest people …

The growth of the Internet will slow drastically, as the flaw in “Metcalfe’s law” — which states that the number of potential connections in a network is proportional to the square of the number of participants — becomes apparent: most people have nothing to say to each other! By 2005 or so, it will become clear that the Internet’s impact on the economy has been no greater than the fax machine’s.

Paul Krugman

Economic growth and gender

16 September, 2017 at 17:59 | Posted in Economics | 2 Comments

The economic implications of gender discrimination are most serious. To deny women is to deprive a country of labor and talent, but — even worse — to undermine the drive to achievement of boys and men. One cannot rear young people in such wise that half of them think themselves superior by biology, without dulling ambition and devaluing accomplishment … To be sure, any society will have its achievers no matter what, if only because it has its own division of tasks and spoils. But it cannot compete with other societies that ask performance from the full pool of talent.

In general, the best clue to a nation’s growth and development potential is the status and role of women. This is the greatest handicap of Muslim Middle Eastern societies today, the flaw that most bars them from modernity.

Any economist who thinks that growth and development have little or nothing to do with cultural and religious imperatives ought to read Landes’ masterful survey of what makes some countries so rich and others so poor.

Stiglitz and the full force of Sonnenschein-Mantel-Debreu

16 September, 2017 at 15:43 | Posted in Economics | 1 Comment

In his recent article ‘Where Modern Macroeconomics Went Wrong’, Joseph Stiglitz acknowledges that his approach “and that of DSGE models begins with the same starting point: the competitive equilibrium model of Arrow and Debreu.”

This is probably also the reason why Stiglitz’ critique doesn’t go far enough.

It’s strange that mainstream macroeconomists still stick to a general equilibrium paradigm more than forty years after the Sonnenschein-Mantel-Debreu theorem — SMD — devastatingly showed that it is an absolute non-starter for building realist and relevant macroeconomics:

SMD theory means that assumptions guaranteeing good behavior at the microeconomic level do not carry over to the aggregate level or to qualitative features of the equilibrium …

Given how sweeping the changes wrought by SMD theory seem to be, it is understandable that some very broad statements about the character of general equilibrium theory were made. Fifteen years after General Competitive Analysis, Arrow (1986) stated that the hypothesis of rationality had few implications at the aggregate level. Kirman (1989) held that general equilibrium theory could not generate falsifiable propositions, given that almost any set of data seemed consistent with the theory …

S. Abu Turab Rizvi

New Classical–Real Business Cycle–DSGE–New Keynesian microfounded macromodels try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality. And — worse still — with something that is not even amenable to the kind of general equilibrium analysis it is thought to give a foundation for, since SMD unequivocally showed that there exists no condition by which assumptions on individuals would guarantee either stability or uniqueness of the equilibrium solution.

Opting for cloned representative agents that are all identical is of course not a real solution to the fallacy of composition that SMD points to. Representative agent models are — as I have argued at length here — rather an evasion whereby issues of distribution, coordination, heterogeneity — everything that really defines macroeconomics — are swept under the rug.

Of course, most macroeconomists know that to use a representative agent is a flagrantly illegitimate method of ignoring real aggregation issues. They keep on with their business, nevertheless, just because it significantly simplifies what they are doing. It reminds one — not a little — of the drunkard who has lost his keys in some dark place and deliberately chooses to look for them under a neighbouring street light just because it is easier to see there.

General equilibrium is fundamental to economics on a more normative level as well. A story about Adam Smith, the invisible hand, and the merits of markets pervades introductory textbooks, classroom teaching, and contemporary political discourse. The intellectual foundation of this story rests on general equilibrium, not on the latest mathematical excursions. If the foundation of everyone’s favourite economics story is now known to be unsound — and according to some, uninteresting as well — then the profession owes the world a bit of an explanation.

Frank Ackerman

Almost a century and a half after Léon Walras founded general equilibrium theory, economists still have not been able to show that markets lead economies to equilibria. We do know that — under very restrictive assumptions — equilibria do exist, are unique and are Pareto-efficient. But what good does that do? As long as we cannot show that there are convincing reasons to suppose there are forces that lead economies to equilibria, the value of general equilibrium theory is nil. As long as we cannot really demonstrate that there are forces operating — under reasonable, relevant and at least mildly realistic conditions — to move markets towards equilibria, there cannot really be any sustainable reason for anyone to pay any interest or attention to this theory.

A stability that can only be proved by assuming Santa Claus conditions is of no avail. Most people do not believe in Santa Claus anymore. And for good reasons — Santa Claus is for kids.

Continuing to model a world full of agents behaving as economists — ‘often wrong, but never uncertain’ — while still not being able to show that the system converges to equilibrium under reasonable assumptions (or simply assuming the problem away) is a gross misallocation of intellectual resources and time.
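To see why stability cannot simply be presumed, here is a minimal sketch (my illustration, not from the post) of Scarf’s classic 1960 counterexample: a three-good exchange economy with Leontief preferences in which tâtonnement price adjustment circles the equilibrium forever instead of converging to it:

```python
# A toy version (my sketch) of Scarf's (1960) three-good exchange economy:
# consumer i owns one unit of good i and has Leontief preferences over
# goods i and i+1, so she demands them in fixed 1:1 proportions. Walras's
# law holds and an equilibrium exists at equal prices, yet the tatonnement
# process dp/dt = z(p) cycles around it forever instead of converging.
import numpy as np

def excess_demand(p):
    z = np.empty(3)
    for j in range(3):
        nxt, prv = (j + 1) % 3, (j - 1) % 3
        # demand for good j by consumers j and j-1, minus the unit supply
        z[j] = p[j] / (p[j] + p[nxt]) + p[prv] / (p[prv] + p[j]) - 1.0
    return z

p = np.array([1.2, 0.9, 0.95])       # start away from the equilibrium ray
dt = 0.001
for _ in range(200_000):             # Euler steps of dp/dt = z(p)
    p += dt * excess_demand(p)

# Both |p| and p1*p2*p3 are invariants of this flow, so the orbit is a
# closed curve: relative prices keep circling and never settle down.
deviation = np.linalg.norm(p / p.mean() - 1.0)
print(f"deviation of relative prices from equilibrium: {deviation:.3f}")
```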


The full force of the Sonnenschein, Mantel, and Debreu (SMD) result is often not appreciated. Without stability or uniqueness, the intrinsic interest of economic analysis based on the general equilibrium model is extremely limited …

The usual way out of this problem is to assume a “representative agent,” and this obviously generates a unique equilibrium. However, the assumption of such an individual is open to familiar criticisms (Kirman 1992; Stoker 1995), and recourse to this creature raises one of the basic problems encountered on the route to the place where general equilibrium has found itself: the problem of aggregation. In fact, we know that, in general, there is no simple relation between individual and aggregate behavior, and to assume that behavior at one level can be assimilated to that at the other is simply erroneous …

The very fact that we observe, in reality, increasing amounts of resources being devoted to informational acquisition and processing implies that the standard general equilibrium model and the standard models of financial markets are failing to capture important aspects of reality.

Alan Kirman

Where modern macroeconomics went wrong

14 September, 2017 at 11:13 | Posted in Economics | 1 Comment

DSGE models seem to take it as a religious tenet that consumption should be explained by a model of a representative agent maximizing his utility over an infinite lifetime without borrowing constraints. Doing so is called micro-founding the model. But economics is a behavioral science. If Keynes was right that individuals saved a constant fraction of their income, an aggregate model based on that assumption is micro-founded.

Of course, the economy consists of individuals who are different, but all of whom have a finite life and most of whom are credit constrained, and who do adjust their consumption behavior, if slowly, in response to changes in their economic environment. Thus, we also know that individuals do not save a constant fraction of their income, come what may. So both stories, the DSGE and the old-fashioned Keynesian, are simplifications. When they are incorporated into a simple macro-model, one is saying the economy acts as if… And then the question is, which provides a better description; a better set of prescriptions; and a better basis for future elaboration of the model. The answer is not obvious. The criticism of DSGE is thus not that it involves simplification: all models do. It is that it has made the wrong modelling choices, choosing complexity in areas where the core story of macroeconomic fluctuations could be told using simpler hypotheses, but simplifying in areas where much of the macroeconomic action takes place.

Joseph Stiglitz

Stiglitz is, of course, absolutely right.

DSGE models are worse than useless — and still, mainstream economists seem to be impressed by the ‘rigour’ brought to macroeconomics by New Classical–New Keynesian DSGE models, with their rational expectations and microfoundations!

It is difficult to see why.

Take the rational expectations assumption. Rational expectations in the mainstream economists’ world imply that relevant distributions have to be time-independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality, it is straining one’s beliefs to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons. In the world of the rational expectations hypothesis, we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that’s also the reason why allowing for cases where agents make ‘predictable errors’ in DSGE models doesn’t take us any closer to a relevant and realist depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations than childish roulette and urn analogies.

Or take the consumption model built into the DSGE models that Stiglitz criticises. There, people are basically portrayed as treating time as a dichotomous phenomenon — today and the future — when contemplating making decisions and acting. How much should one consume today and how much in the future? Facing an intertemporal budget constraint of the form

c_t + c_f/(1+r) = f_t + y_t + y_f/(1+r),

where c_t is consumption today, c_f is consumption in the future, f_t is holdings of financial assets today, y_t is labour income today, y_f is labour income in the future, and r is the real interest rate, and having a lifetime utility function of the form

U = u(c_t) + a·u(c_f),

where a is the time-discounting parameter, the representative agent (consumer) maximizes his utility when

u′(c_t) = a(1+r)u′(c_f).

This expression — the Euler equation — implies that the representative agent (consumer) is indifferent between consuming one more unit today and consuming it in the future instead. Typically using a logarithmic functional form, u(c) = log c, which gives u′(c) = 1/c, the Euler equation can be rewritten as

1/c_t = a(1+r)(1/c_f),

or

c_f/c_t = a(1+r).

This importantly implies that, according to the neoclassical consumption model, changes in the (real) interest rate and the ratio between future and present consumption move in the same direction.

So far, so good. But how about the real world? Is neoclassical consumption theory, as described in this kind of model, in tune with the empirical facts? Hardly — the data and the models are, as a rule, inconsistent!

In the Euler equation, we only have one interest rate, equated to the money market rate as set by the central bank. The crux is that — given almost any specification of the utility function — the interest rate implied by the Euler equation and the actual money market rate are often found to be strongly negatively correlated in the empirical literature. The data on returns and aggregate consumption simply are inconsistent with the DSGE models.

Although yours truly shares a lot of Stiglitz’ critique of DSGE modelling — “the standard DSGE model provides a poor basis for policy, and tweaks to it are unlikely to be helpful” — it has to be said that his more general description of the history and state of modern macroeconomics is less convincing. Stiglitz notices that some of the greatest deficiencies in DSGE models “relates to the treatment of uncertainty,” but doesn’t really follow up on that core difference between Keynesian ‘genuine uncertainty’ economics and neoclassical ‘stochastic risk’ economics. DSGE models are only the latest outgrowth of neoclassical general equilibrium (Arrow-Debreu) economics. And that theory has never been, and will never be, a good starting point for constructing good macroeconomic theory and models. When the foundation of the house you build is weak, it will never be somewhere you want to live, no matter how many new — and in Stiglitz’ view better — varieties of ‘micro-foundations’ you add. As Lance Taylor writes (personal communication):

Aside from consistent accounting (Walras’s Law), the Arrow-Debreu model is useless for practical macroeconomic purposes. Its “agents” could never carry through their assigned optimization exercises, a feat beyond the capacity of a universal Turing machine, let alone feeble human brains. The Sonnenschein-Mantel-Debreu theorem, moreover, shows that microeconomic rationality assumptions have no significant macroeconomic implications …

The phalanx of models marshalled by Stiglitz uses “expectation” to mean the first moment of some objective probability distribution, shared by some or all players. In contrast, Keynes adopted a non-frequentist (and non-Bayesian) interpretation, whereby we cannot make a probabilistic assessment about whether some future event(s) will occur. “Fundamental uncertainty” is a kissing cousin of Donald Rumsfeld’s “unknown unknowns.” Moments computed from numbers of the past are irrelevant. Plugging them into algebraic machines generates no useful information.

As with all great bodies of thought, many strands of what Keynes has to say appear to be contradictory. Strict macro accounting on one hand and belief in the fundamental instability of capitalism make up just one example. Bastard Keynesianism is just an attempt to sidestep the contradictions. Whether Keynes’s own ideas will help us disentangle the unknowns of the future is unclear. It is clear that Stiglitz’s enormous tool bag will not be very helpful.

 

Rethinking expectations

14 September, 2017 at 08:53 | Posted in Economics | Comments Off on Rethinking expectations

The tiny little problem that there is no hard empirical evidence that verifies rational expectations models doesn’t usually bother their protagonists too much. Rational expectations überpriest Thomas Sargent has defended the epistemological status of the rational expectations hypothesis, arguing that since it “focuses on outcomes and does not pretend to have behavioral content,” it has proved to be “a powerful tool for making precise statements.”

Precise, yes, but relevant and realistic? I’ll be dipped!

In their attempted rescue operations, rational expectationists try to give the picture that only heterodox economists like yours truly are critical of the rational expectations hypothesis.

But, on this, they are, simply … eh … wrong.

Let’s listen to Nobel laureate Edmund Phelps — hardly a heterodox economist — and what he has to say (emphasis added):

Question: In a new volume with Roman Frydman, “Rethinking Expectations: The Way Forward for Macroeconomics,” you say the vast majority of macroeconomic models over the last four decades derailed your “microfoundations” approach. Can you explain what that is and how it differs from the approach that became widely accepted by the profession?

Answer: In the expectations-based framework that I put forward around 1968, we didn’t pretend we had a correct and complete understanding of how firms or employees formed expectations about prices or wages elsewhere. We turned to what we thought was a plausible and convenient hypothesis. For example, if the prices of a company’s competitors were last reported to be higher than in the past, it might be supposed that the company will expect their prices to be higher this time, too, but not that much. This is called “adaptive expectations”: You adapt your expectations to new observations but don’t throw out the past. If inflation went up last month, it might be supposed that inflation will again be high but not that high.

Q: So how did adaptive expectations morph into rational expectations?

A: The “scientists” from Chicago and MIT came along to say, we have a well-established theory of how prices and wages work. Before, we used a rule of thumb to explain or predict expectations: Such a rule is picked out of the air. They said, let’s be scientific. In their mind, the scientific way is to suppose price and wage setters form their expectations with every bit as much understanding of markets as the expert economist seeking to model, or predict, their behavior. The rational expectations approach is to suppose that the people in the market form their expectations in the very same way that the economist studying their behavior forms her expectations: on the basis of her theoretical model.

Q: And what’s the consequence of this putsch?

A: Craziness for one thing. You’re not supposed to ask what to do if one economist has one model of the market and another economist a different model. The people in the market cannot follow both economists at the same time. One, if not both, of the economists must be wrong. Another thing: It’s an important feature of capitalist economies that they permit speculation by people who have idiosyncratic views, and an important feature of a modern capitalist economy that innovators conceive their new products and methods with little knowledge of whether the new things will be adopted — thus innovations. Speculators and innovators have to roll their own expectations. They can’t ring up the local professor to learn how. The professors should be ringing up the speculators and aspiring innovators. In short, expectations are causal variables in the sense that they are the drivers. They are not effects to be explained in terms of some trumped-up causes.

Q: So rather than live with variability, write a formula in stone!

A: What led to rational expectations was a fear of the uncertainty and, worse, the lack of understanding of how modern economies work. The rational expectationists wanted to bottle all that up and replace it with deterministic models of prices, wages, even share prices, so that the math looked like the math in rocket science. The rocket’s course can be modelled while a living modern economy’s course cannot be modelled to such an extreme. It yields up a formula for expectations that looks scientific because it has all our incomplete and not altogether correct understanding of how economies work inside of it, but it cannot have the incorrect and incomplete understanding of economies that the speculators and would-be innovators have.

Q: One of the issues I have with rational expectations is the assumption that we have perfect information, that there is no cost in acquiring that information. Yet the economics profession, including Federal Reserve policy makers, appears to have been hijacked by Robert Lucas.

A: You’re right that people are grossly uninformed, which is a far cry from what the rational expectations models suppose. Why are they misinformed? I think they don’t pay much attention to the vast information out there because they wouldn’t know what to do with it if they had it. The fundamental fallacy on which rational expectations models are based is that everyone knows how to process the information they receive according to the one and only right theory of the world. The problem is that we don’t have a “right” model that could be certified as such by the National Academy of Sciences. And as long as we operate in a modern economy, there can never be such a model.

Bloomberg

The rational expectations hypothesis presumes consistent behaviour, where expectations do not display any persistent errors. In the world of rational expectations we are always, on average, hitting the bull’s eye. In the more realistic, open systems view, there is always the possibility (danger) of making mistakes that may turn out to be systematic. It is because of this, presumably, that we put so much emphasis on learning in our modern knowledge societies.
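As a toy illustration (mine, not from the interview) of the adaptive-expectations rule Phelps describes, and of the systematic errors it can produce when the environment drifts:

```python
# Adaptive expectations, as Phelps describes them: adjust your forecast
# partway towards the latest observation, without throwing out the past.
#     E[t+1] = E[t] + lam * (x[t] - E[t])
# The drifting inflation process below is invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
lam, T = 0.3, 50                    # adjustment speed (assumed), horizon
inflation = np.empty(T)
expectation = np.empty(T)
expectation[0] = 2.0                # initial belief, percent

for t in range(T):
    # actual inflation drifts upward -- the kind of structural change a
    # fixed, known probability distribution rules out
    inflation[t] = 2.0 + 0.08 * t + rng.normal(0.0, 0.5)
    if t + 1 < T:
        expectation[t + 1] = expectation[t] + lam * (inflation[t] - expectation[t])

# With a drifting target the forecast errors stay persistently positive:
# exactly the systematic mistakes the rational expectations hypothesis excludes.
print(f"mean forecast error: {np.mean(inflation - expectation):.2f} percentage points")
```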

So, where does all this leave us? I think John Kay sums it up pretty well:

A scientific idea is not seminal because it influences the research agenda of PhD students. An important scientific advance yields conclusions that differ from those derived from other theories and establishes that these divergent conclusions are supported by observation. Yet as Prof Sargent disarmingly observed, “such empirical tests were rejecting too many good models” in the programme he had established with fellow Nobel laureates Bob Lucas and Ed Prescott. In their world, the validity of a theory is demonstrated if, after the event, and often with torturing of data and ad hoc adjustments that are usually called “imperfections”, it can be reconciled with already known facts – “calibrated”. Since almost everything can be “explained” in this way, the theory is indeed universal; no other approach is necessary, or even admissible …

