Roman Frydman on the ‘rational expectations’ hoax

21 Nov, 2015 at 18:04 | Posted in Economics | 2 Comments

Lynn Parramore: It seems obvious that both fundamentals and psychology matter. Why haven’t economists developed an approach to modeling stock-price movements that incorporates both?

Roman Frydman: It took a while to realize that the reason is relatively straightforward. Economists have relied on models that assume away unforeseeable change. As different as they are, rational expectations and behavioral-finance models represent the market with what mathematicians call a probability distribution – a rule that specifies in advance the chances of absolutely everything that will ever happen.

In a world in which nothing unforeseen ever happened, rational individuals could compute precisely whatever they had to know about the future to make profit-maximizing decisions. Presuming that they do not fully rely on such computations and resort to psychology would mean that they forego profit opportunities.

LP: So this is why I often hear that supporters of the Rational Expectations Hypothesis imagine people as autonomous agents that mechanically make decisions in order to maximize profits?

RF: Yes! What has been misunderstood is that this purely computational notion of economic rationality is an artifact of assuming away unforeseeable change.

Imagine that I have a probabilistic model for stock prices and dividends, and I hypothesize that my model shows how prices and dividends actually unfold. Now I have to suppose that rational people will have exactly the same interpretation as I do — after all, I’m right and I have accounted for all possibilities … This is essentially the idea underpinning the Rational Expectations Hypothesis …

LP: So the only truth is the non-existence of the one true model?

RF: It’s the genuine openness that makes our ideas – and education – more exciting. Students can think about things in an open, yet structured way. We don’t lose the structure; we just renounce the pretense of exact knowledge.

Economics is not mechanistic. It requires understanding of history, politics, and psychology.  Some say that economics is an art, but NREH is actually rigorous economics. It simply recognizes that there’s a limit to what we can know.

Economists may fear that acknowledging this limit would make economic analysis unscientific. But that fear is rooted in a misconception of what the social scientific enterprise should be. Scientific knowledge generates empirically relevant regularities that are likely to be durable. In economics, that knowledge can only be qualitative, and grasping this insight is essential to its scientific status.  Until now, we have been wasting time looking for a model that would tell us exactly how the market works.

LP: Chasing the Holy Grail?

RF: Yes. It’s an illusion. We’ve trained generation after generation in this fruitless task, and it leads to extreme thinking. Fama and Shiller need not see themselves in irreconcilable opposition. There is no one truth. They both have had critical insights, and NREH acknowledges that and builds on their work.

Huffington Post

Roman Frydman is Professor of Economics at New York University and a long-time critic of the rational expectations hypothesis. In his seminal 1982 American Economic Review article Towards an Understanding of Market Processes: Individual Expectations, Learning, and Convergence to Rational Expectations Equilibrium — an absolute must-read for anyone with a serious interest in understanding what the issues are in the present discussion of rational expectations as a modeling assumption — he showed that models founded on the rational expectations hypothesis are inadequate as representations of economic agents’ decision making.
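For readers who want the idea in symbols, here is the standard textbook formulation of the hypothesis Frydman criticizes (a stylized restatement in conventional notation, not Frydman’s own): rational expectations require every agent’s subjective forecast to coincide with the mathematical conditional expectation generated by the presumed true model,

$$E_t^{i}[p_{t+1}] = E\big[p_{t+1}\mid \Omega_t\big] \quad \text{for every agent } i,$$

so that in the canonical asset-pricing application the stock price is pinned down by

$$p_t = E\!\left[\frac{p_{t+1} + d_{t+1}}{1+r}\;\Big|\;\Omega_t\right],$$

where $\Omega_t$ is the common information set, $d_{t+1}$ is next period’s dividend and $r$ a constant required return. Everything that can ever happen is already contained in the probability distribution over which the expectation is taken, which is precisely the ‘assuming away of unforeseeable change’ Frydman objects to.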

Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies and institutions are those based on rational expectations and representative actors. As yours truly has tried to show in On the use and misuse of theories and models in economics there is really no support for this conviction at all. On the contrary. If we want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, it is high time to place macroeconomic models building on representative actors and rational expectations-microfoundations in the dustbin of pseudo-science.

 

Lacrimosa

20 Nov, 2015 at 19:41 | Posted in Varia | Comments Off on Lacrimosa

 

[AP photo: a man at a Paris café after the attacks]

George Washington

20 Nov, 2015 at 18:32 | Posted in Politics & Society | Comments Off on George Washington

The bosom of America is open to receive not only the opulent and respectable stranger, but the oppressed and persecuted of all nations and religions …

George Washington

Models and the poverty of atomistic behavioural assumptions

20 Nov, 2015 at 10:37 | Posted in Economics | Comments Off on Models and the poverty of atomistic behavioural assumptions

 

Is macroeconomics for real?

19 Nov, 2015 at 12:10 | Posted in Economics | 5 Comments

Empirically, far from isolating a microeconomic core, real-business-cycle models, as with other representative-agent models, use macroeconomic aggregates for their testing and estimation. Thus, to the degree that such models are successful in explaining empirical phenomena, they point to the ontological centrality of macroeconomic and not to microeconomic entities … At the empirical level, even the new classical representative-agent models are fundamentally macroeconomic in content …

The nature of microeconomics and macroeconomics — as they are currently practiced — undermines the prospects for a reduction of macroeconomics to microeconomics. Both microeconomics and macroeconomics must refer to irreducible macroeconomic entities.

Kevin Hoover

Kevin Hoover has been writing on microfoundations for more than 25 years now, and is beyond any doubt the economist/econometrician/methodologist who has thought most about the issue. It’s always interesting to compare his qualified and methodologically founded assessment of the representative-agent-rational-expectations microfoundationalist program with the more or less apologetic views of freshwater economists like Robert Lucas:

Given what we know about representative-agent models, there is not the slightest reason for us to think that the conditions under which they should work are fulfilled. The claim that representative-agent models provide microfoundations succeeds only when we steadfastly avoid the fact that representative-agent models are just as aggregative as old-fashioned Keynesian macroeconometric models. They do not solve the problem of aggregation; rather they assume that it can be ignored. While they appear to use the mathematics of microeconomics, the subjects to which they apply that microeconomics are aggregates that do not belong to any agent. There is no agent who maximizes a utility function that represents the whole economy subject to a budget constraint that takes GDP as its limiting quantity. This is the simulacrum of microeconomics, not the genuine article …

[W]e should conclude that what happens to the microeconomy is relevant to the macroeconomy but that macroeconomics has its own modes of analysis … [I]t is almost certain that macroeconomics cannot be euthanized or eliminated. It shall remain necessary for the serious economist to switch back and forth between microeconomics and a relatively autonomous macroeconomics depending upon the problem in hand.

Instead of just methodologically sleepwalking into their models, modern followers of the Lucasian microfoundational program ought to do some reflection and at least try to come up with a sound methodological justification for their position. Just looking the other way won’t do. Writes Hoover:

The representative-agent program elevates the claims of microeconomics in some version or other to the utmost importance, while at the same time not acknowledging that the very microeconomic theory it privileges undermines, in the guise of the Sonnenschein-Debreu-Mantel theorem, the likelihood that the utility function of the representative agent will be any direct analogue of a plausible utility function for an individual agent … The new classicals treat [the difficulties posed by aggregation] as a non-issue, showing no appreciation of the theoretical work on aggregation and apparently unaware that earlier uses of the representative-agent model had achieved consistency with theory only at the price of empirical relevance.
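For readers who have not met it, the Sonnenschein-Debreu-Mantel theorem Hoover invokes can be stated informally as follows (a textbook paraphrase, not Hoover’s own wording): any function $z(p)$ that is continuous, homogeneous of degree zero in prices, $z(\lambda p) = z(p)$, and satisfies Walras’ law, $p \cdot z(p) = 0$, can be generated as the aggregate excess-demand function of some exchange economy populated by perfectly well-behaved, utility-maximizing consumers, provided there are at least as many consumers as goods and prices are bounded away from zero. Individual rationality thus places essentially no restrictions on aggregate demand, which is why the representative agent’s ‘microfoundations’ are assumed rather than derived.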

Where ‘New Keynesian’ and New Classical economists think that they can rigorously deduce the aggregate effects of (representative) actors with their reductionist microfoundational methodology, they — as argued in my On the use and misuse of theories and models in economics — have to turn a blind eye to the emergent properties that characterize all open social and economic systems. The interaction between animal spirits, trust, confidence, institutions, etc., cannot be deduced from or reduced to a question answerable at the individual level. Macroeconomic structures and phenomena have to be analyzed also on their own terms.

Just playing games? Count me out!

18 Nov, 2015 at 15:41 | Posted in Economics | 2 Comments

I have spent a considerable part of my life building economic models, and examining the models that other economists have built. I believe that I am making reasonably good use of my talents in an attempt to understand the social world. I have no fellow-feeling with those economic theorists who, off the record at seminars and conferences, admit that they are only playing a game with other theorists. If their models are not intended seriously, I want to say (and do say when I feel sufficiently combative), why do they expect me to spend my time listening to their expositions? Count me out of the game.

Robert Sugden

Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies and institutions are those based on rational expectations and representative actors. As yours truly has tried to show in On the use and misuse of theories and models in economics there is really no support for this conviction at all. On the contrary. If we want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, it is high time to place macroeconomic models building on representative actors and rational expectations-microfoundations where they belong – in the dustbin.

For if this microfounded macroeconomics has nothing to say about the real world and the economic problems out there, why should we care about it? The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging de facto is made, macroeconomic model building is little more than hand waving that gives us rather little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

The real macroeconomic challenge is to accept uncertainty and still try to explain why economic transactions take place – instead of simply conjuring the problem away by assuming rational expectations and treating uncertainty as if it was possible to reduce it to stochastic risk. That is scientific cheating. And it has been going on for too long now. If that’s the kind of game you want to play — count me out!

The open society and its enemies

15 Nov, 2015 at 16:37 | Posted in Politics & Society | Comments Off on The open society and its enemies

Unlimited tolerance must lead to the disappearance of tolerance. If we extend unlimited tolerance even to those who are intolerant, if we are not prepared to defend a tolerant society against the onslaught of the intolerant, then the tolerant will be destroyed, and tolerance with them … We should therefore claim, in the name of tolerance, the right not to tolerate the intolerant.

Karl Popper The Open Society and Its Enemies (1945)


Bleu Blanc Rouge

14 Nov, 2015 at 22:58 | Posted in Varia | Comments Off on Bleu Blanc Rouge

 

Though I speak with the tongues of angels,
If I have not love…
My words would resound with but a tinkling cymbal.
And though I have the gift of prophecy…
And understand all mysteries…
and all knowledge…
And though I have all faith
So that I could remove mountains,
If I have not love…
I am nothing.

November 13th, 2015 — a date which will live in infamy

14 Nov, 2015 at 15:11 | Posted in Economics | Comments Off on November 13th, 2015 — a date which will live in infamy

 

The verdict of history will be harsh.

Are economic models ‘true enough’?

13 Nov, 2015 at 19:39 | Posted in Theory of Science & Methodology | 2 Comments

Stylized facts are close kin of ceteris paribus laws. They are ‘broad generalizations true in essence, though perhaps not in detail’. They play a major role in economics, constituting explananda that economic models are required to explain. Models of economic growth, for example, are supposed to explain the (stylized) fact that the profit rate is constant. The unvarnished fact of course is that profit rates are not constant. All sorts of non-economic factors — e.g., war, pestilence, drought, political chicanery — interfere. Manifestly, stylized facts are not (what philosophers would call) facts, for the simple reason that they do not actually obtain. It might seem then that economics takes itself to be required to explain why known falsehoods are true. (Voodoo economics, indeed!) This can’t be correct.

Rather, economics is committed to the view that the claims it recognizes as stylized facts are in the right neighborhood, and that their being in the right neighborhood is something economic models should account for. The models may show them to be good approximations in all cases, or where deviations from the economically ideal are small, or where economic factors dominate non-economic ones. Or they might afford some other account of their often being nearly right. The models may diverge as to what is actually true, or as to where, to what degree, and why the stylized facts are as good as they are. But to fail to acknowledge the stylized facts would be to lose valuable economic information (for example, the fact that if we control for the effects of such non-economic interference as war, disease, and the president for life absconding with the national treasury, the profit rate is constant.)

Stylized facts figure in other social sciences as well. I suspect that under a less alarming description, they occur in the natural sciences too. The standard characterization of the pendulum, for example, strikes me as a stylized fact of physics. The motion of the pendulum which physics is supposed to explain is a motion that no actual pendulum exhibits. What such cases point to is this: The fact that a strictly false description is in the right neighborhood sometimes advances understanding of a domain.

Catherine Elgin
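Elgin’s pendulum example can be made concrete. The exact equation of motion of a pendulum of length $\ell$ is

$$\ddot{\theta} + \frac{g}{\ell}\,\sin\theta = 0,$$

which has no elementary closed-form expression for the period. The ‘stylized’ pendulum of the textbooks replaces $\sin\theta$ by $\theta$ (the small-angle idealization) and yields simple harmonic motion with period

$$T = 2\pi\sqrt{\frac{\ell}{g}},$$

a description that no actual pendulum (finite amplitude, friction, a bob that is not a point mass) exactly satisfies, yet one that genuinely advances our understanding of real pendulums. That is the sense in which a strictly false description can be ‘in the right neighborhood’.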

Catherine Elgin thinks we should accept model claims when we consider them to be ‘true enough,’ and Uskali Mäki has argued in a similar vein, maintaining that it could be warranted — based on diverse pragmatic considerations — to accept model claims that are negligibly false.

Hmm …

When criticizing the basic (DSGE) workhorse model for its inability to explain involuntary unemployment, its defenders maintain that later elaborations — especially newer search models — manage to do just that. One of the more conspicuous problems with those ‘solutions,’ however, is that they — as e.g. Pissarides’ “Loss of Skill during Unemployment and the Persistence of Unemployment Shocks” (QJE 1992) — are as a rule constructed without seriously trying to warrant that the model-immanent assumptions and results are applicable in the real world. External validity is more or less a non-existent problematique, sacrificed on the altar of model derivations. This is not by chance. For how could one even imagine empirically testing assumptions such as Pissarides’ ‘model 1’ assumptions — that reality is adequately represented by ‘two overlapping generations of fixed size’, ‘wages determined by Nash bargaining’, ‘actors maximizing expected utility’, ‘endogenous job openings’ and ‘job matching describable by a probability distribution’ — without coming to the conclusion that this is, in terms of realism and relevance, far from ‘negligibly false’ or ‘true enough’?
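To see what kind of entities these assumptions are, consider a minimal sketch of the textbook matching apparatus such models build on (standard Diamond-Mortensen-Pissarides notation, not the exact equations of the 1992 paper): matches between $u_t$ unemployed workers and $v_t$ vacancies are produced by

$$m_t = A\,u_t^{\alpha}\,v_t^{1-\alpha}, \qquad \theta_t = \frac{v_t}{u_t},$$

so a worker finds a job with probability $f(\theta_t) = m_t/u_t$ and a vacancy is filled with probability $q(\theta_t) = m_t/v_t$, while Nash bargaining over the match surplus gives, in the simplest version, the wage

$$w = (1-\beta)\,z + \beta\,(p + c\,\theta),$$

with $z$ income while unemployed, $p$ productivity, $c$ the flow cost of posting a vacancy and $\beta$ the worker’s bargaining share. Every object in these equations is a construct internal to the model world; none of them comes with any instruction for how it is to be identified, measured or tested against the labour markets we actually observe.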

Suck on that — and tell me if those typical mainstream — neoclassical — modeling assumptions in any possibly relevant way — with or without due pragmatic considerations — can be considered anything else but imagined model-world assumptions that have nothing at all to do with the real world we happen to live in!

Econometrics — fictions masquerading as science

13 Nov, 2015 at 17:14 | Posted in Statistics & Econometrics | Comments Off on Econometrics — fictions masquerading as science

In econometrics one often gets the feeling that many of its practitioners think of it as a kind of automatic inferential machine: input data and out comes causal knowledge. This is like pulling a rabbit from a hat. Great — but first you have to put the rabbit in the hat. And this is where assumptions come into the picture.

As social scientists — and economists — we have to confront the all-important question of how to handle uncertainty and randomness. Should we equate randomness with probability? If we do, we have to accept that to speak of randomness we also have to presuppose the existence of nomological probability machines, since probabilities cannot be spoken of – and actually, to be strict, do not at all exist – without specifying such system-contexts.

Accepting a domain of probability theory and a sample space of “infinite populations” — which is legion in modern econometrics — also implies that judgments are made on the basis of observations that are actually never made! Infinitely repeated trials or samplings never take place in the real world. So that cannot be a sound inductive basis for a science with aspirations of explaining real-world socio-economic processes, structures or events. It’s not tenable.
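A minimal simulation (a Python sketch assuming only numpy; the numbers are mine and purely illustrative) makes the point concrete: relative frequencies estimate a single, stable probability only when a stable chance set-up (a nomological machine) generates the data.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # A stable chance set-up ('nomological machine'): a constant probability of 0.3.
    stable = rng.random(n) < 0.3

    # No stable set-up: the data-generating mechanism drifts over the sample.
    drifting = rng.random(n) < np.linspace(0.1, 0.5, n)

    half = n // 2
    for name, x in [("stable  ", stable), ("drifting", drifting)]:
        print(name,
              "first half:", round(x[:half].mean(), 3),
              "second half:", round(x[half:].mean(), 3))

    # The stable set-up shows the same frequency in both halves (about 0.3);
    # the drifting one shows two different 'probabilities' (about 0.2 and 0.4),
    # i.e. there is no single population quantity the sample is estimating.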

In his book Statistical Models and Causal Inference: A Dialogue with the Social Sciences David Freedman touches on this fundamental problem, arising when you try to apply statistical models outside overly simple nomological machines like coin tossing and roulette wheels:

Lurking behind the typical regression model will be found a host of such assumptions; without them, legitimate inferences cannot be drawn from the model. There are statistical procedures for testing some of these assumptions. However, the tests often lack the power to detect substantial failures. Furthermore, model testing may become circular; breakdowns in assumptions are detected, and the model is redefined to accommodate. In short, hiding the problems can become a major goal of model building.

Using models to make predictions of the future, or the results of interventions, would be a valuable corrective. Testing the model on a variety of data sets – rather than fitting refinements over and over again to the same data set – might be a good second-best … Built into the equation is a model for non-discriminatory behavior: the coefficient d vanishes. If the company discriminates, that part of the model cannot be validated at all.

Regression models are widely used by social scientists to make causal inferences; such models are now almost a routine way of demonstrating counterfactuals. However, the “demonstrations” generally turn out to depend on a series of untested, even unarticulated, technical assumptions. Under the circumstances, reliance on model outputs may be quite unjustified. Making the ideas of validation somewhat more precise is a serious problem in the philosophy of science. That models should correspond to reality is, after all, a useful but not totally straightforward idea – with some history to it. Developing appropriate models is a serious problem in statistics; testing the connection to the phenomena is even more serious …

In our days, serious arguments have been made from data. Beautiful, delicate theorems have been proved, although the connection with data analysis often remains to be established. And an enormous amount of fiction has been produced, masquerading as rigorous science.

Making outlandish statistical assumptions does not provide a solid ground for doing relevant social science.

Methodological foundations of heterodox economics

12 Nov, 2015 at 21:48 | Posted in Economics | Comments Off on Methodological foundations of heterodox economics

 

[h/t Jan Milch]

Read my lips — regression analysis does not imply causation

11 Nov, 2015 at 18:45 | Posted in Statistics & Econometrics | Comments Off on Read my lips — regression analysis does not imply causation

Many treatments of regression seem to take for granted that the investigator knows the relevant variables, their causal order, and the functional form of the relationships among them; measurements of the independent variables are assumed to be without error. Indeed, Gauss developed and used regression in physical science contexts where these conditions hold, at least to a very good approximation. Today, the textbook theorems that justify regression are proved on the basis of such assumptions.

In the social sciences, the situation seems quite different. Regression is used to discover relationships or to disentangle cause and effect. However, investigators have only vague ideas as to the relevant variables and their causal order; functional forms are chosen on the basis of convenience or familiarity; serious problems of measurement are often encountered.


Regression may offer useful ways of summarizing the data and making predictions. Investigators may be able to use summaries and predictions to draw substantive conclusions. However, I see no cases in which regression equations, let alone the more complex methods, have succeeded as engines for discovering causal relationships …

The larger problem remains. Can quantitative social scientists infer causality by applying statistical technology to correlation matrices? That is not a mathematical question, because the answer turns on the way the world is put together. As I read the record, correlational methods have not delivered the goods. We need to work on measurement, design, theory. Fancier statistics are not likely to help much.

David Freedman

If you only have time to study one mathematical statistician, the choice should be easy — David Freedman.
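A minimal simulation (a Python sketch assuming only numpy; the variable names and numbers are hypothetical) illustrates Freedman’s point that the causal conclusions come from the assumptions, not from the regression arithmetic: a confounded regressor picks up a sizeable ‘effect’ even though it has no causal influence at all.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # A confounder z drives both x and y; x has no direct effect on y at all.
    z = rng.normal(size=n)
    x = z + rng.normal(scale=0.5, size=n)
    y = 2.0 * z + rng.normal(scale=0.5, size=n)

    # Regressing y on x alone: the slope looks like a substantial 'causal effect'.
    X1 = np.column_stack([np.ones(n), x])
    b1, *_ = np.linalg.lstsq(X1, y, rcond=None)
    print("y ~ x    :", round(b1[1], 2))   # about 1.6, entirely spurious

    # Conditioning on the confounder makes the 'effect' of x vanish.
    X2 = np.column_stack([np.ones(n), x, z])
    b2, *_ = np.linalg.lstsq(X2, y, rcond=None)
    print("y ~ x + z:", round(b2[1], 2))   # close to 0

Whether such a z is observed at all, and whether the functional form is right, is exactly the kind of untested assumption on which, as Freedman stresses, the causal reading of regression output quietly depends.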

Why do I have a blog?

11 Nov, 2015 at 16:34 | Posted in Varia | Comments Off on Why do I have a blog?

 

Mainstream economics — nothing but pseudo-scientific cheating

10 Nov, 2015 at 17:24 | Posted in Economics | 4 Comments

A common idea among mainstream — neoclassical — economists is that science advances through the use of ‘as if’ modeling assumptions and ‘successive approximations’. But is this really a feasible methodology? I think not.

Most models in science are representations of something else. Models “stand for” or “depict” specific parts of a “target system” (usually the real world).  All theories and models have to use sign vehicles to convey some kind of content that may be used for saying something of the target system. But purpose-built assumptions — like “rational expectations” or “representative actors” — made solely to secure a way of reaching deductively validated results in mathematical models, are of little value if they cannot be validated outside of the model.

All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

The implications that follow from the kind of models that mainstream economists construct are always conditional on the simplifying assumptions used — assumptions predominantly of a rather far-reaching and non-empirical character with little resemblance to features of the real world. From a descriptive point of view there is a fortiori usually very little resemblance between the models used and the empirical world. ‘As if’ explanations building on such foundations are not really explanations at all, since they always build conditionally on hypothesized law-like theorems and situation-specific restrictive assumptions. The empirical-descriptive inaccuracy of the models makes it more or less miraculous if they could — in any substantive way — be considered explanatory at all. If the assumptions that are made are known to be descriptively totally unrealistic (think of e.g. ‘rational expectations’) they are of course likewise totally worthless for making empirical inductions. Assuming that people behave ‘as if’ they were rational FORTRAN-programmed computers doesn’t take us far when we know that the ‘if’ is false.

Theories are difficult to directly confront with reality. Economists therefore build models of their theories. Those models are representations that are directly examined and manipulated to indirectly say something about the target systems.

But models do not only face theory. They also have to look to the world. Being able to model a “credible world,” a world that somehow could be considered real or similar to the real world, is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still possibly serve our pursuit of truth. But then they cannot be unrealistic or false in just any way; the falsehood or unrealisticness has to be qualified.

One could of course also ask for robustness, but the ‘as if’ worlds, even after having been tested for robustness, can still be a long way from reality – and unfortunately often in ways we know are important. Robustness of claims in a model does not per se give a warrant for exporting the claims to real-world target systems.

Anyway, robust theorems are exceedingly rare or non-existent in macroeconomics. Explanation, understanding and prediction of real world phenomena, relations and mechanisms therefore cannot be grounded (solely) on robustness analysis. Some of the standard assumptions made in neoclassical economic theory – on rationality, information handling and types of uncertainty – are not possible to make more realistic by de-idealization or successive approximations without altering the theory and its models fundamentally.
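A toy example may make this concrete. In the textbook cobweb market sketched below (purely illustrative parameter values, not a claim about any particular published model), the expectational assumption is not a harmless ‘as if’ idealization that can be relaxed step by step: keep rational expectations and the price sits at its equilibrium value; replace them with naive expectations and the very same model oscillates explosively. The qualitative conclusion stands or falls with the assumption.

    # Cobweb market: demand a - b*p, supply c + d*E[p].
    a, b, c, d = 10.0, 1.0, 1.0, 1.2

    p_star = (a - c) / (b + d)      # rational-expectations equilibrium price

    p = 0.9 * p_star                # start slightly away from equilibrium
    path = []
    for _ in range(10):
        # naive expectations: suppliers expect today's price to equal yesterday's
        p = (a - c - d * p) / b
        path.append(round(p, 2))

    print("rational-expectations price (constant):", round(p_star, 2))
    print("naive-expectations price path:", path)   # oscillates, amplitude grows

Since ‘de-idealizing’ the expectational assumption flips the model’s behaviour, successive approximation toward more realistic expectations does not leave the original conclusions approximately intact.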

If we cannot show that the mechanisms or causes we isolate and handle in our models are stable, in the sense that they do not change from one situation to another when we export them from our models to our target systems, then they only hold under ceteris paribus conditions and are a fortiori of limited value for our understanding, explanation and prediction of our real-world target systems.

The obvious shortcoming of a basically epistemic — rather than ontological — approach such as ‘successive approximations’ and ‘as if’ modeling assumptions is that “similarity” or “resemblance” tout court do not guarantee that the correspondence between model and target is interesting, relevant, revealing or somehow adequate in terms of mechanisms, causal powers, capacities or tendencies. No matter how many convoluted refinements of concepts are made in the model, if the successive ‘as if’ approximations do not result in models similar to reality in the appropriate respects (such as structure, isomorphism, etc.), they are nothing more than ‘substitute systems’ that do not bridge to the world but rather miss their target.

So, I have to conclude that constructing minimal macroeconomic ‘as if’ models, or using microfounded macroeconomic models as “stylized facts” somehow “successively approximating” macroeconomic reality, is a rather unimpressive attempt at legitimizing the use of fictitious idealizations for reasons that have more to do with model tractability than with a genuine interest in understanding and explaining features of real economies. Many of the model assumptions standardly made by neoclassical macroeconomics are restrictive rather than harmless and cannot, a fortiori, in any sensible meaning be considered approximations at all.

Mainstream economics building on such a modeling strategy does not produce science.

It’s nothing but pseudo-scientific cheating.

The thrust of this realist rhetoric is the same both at the scientific and at the meta-scientific levels. It is that explanatory virtues need not be evidential virtues. It is that you should feel cheated by “The world is as if T were true”, in the same way as you should feel cheated by “The stars move as if they were fixed on a rotating sphere”. Realists do feel cheated in both cases.

Alan Musgrave
