On testing and learning in a non-repetitive world

28 February, 2018 at 18:05 | Posted in Economics | 8 Comments

The incorporation of new information makes sense only if the future is to be similar to the past. Any kind of empirical test, whatever form it adopts, will not make sense, however, if the world is uncertain because in such a world induction does not work. Past experience is not a useful guide to guess the future in these conditions (it only serves when the future, somehow, is already implicit in the present) … I believe the only way to use past experience is to assume that the world is repetitive. In a non-repetitive world in which relevant novelties unexpectedly arise testing is irrelevant …

These considerations are applicable to decisions in conditions of radical uncertainty. If the actions that I undertake in t0 will have very different consequences according to the eventual state of the world in t1, it is crucial to gather reliable knowledge about these states. But how could I evaluate in t0 my beliefs about the state of the world in t1? If the world were repetitive (governed by immutable laws) and these laws were known, I could assume that what I find out about the present state is relevant to determine how the future state (the one that will prevail) will be. It would make then sense to apply a strategy for gathering empirical evidence (a sequence of actions to collect new data). But if the world is not repetitive, what makes me think that the new information may be at all useful regarding future events? …

Conceiving economic processes as sequences of events in which uncertainty reigns, where consequently there are “no laws”, nor “invariants” or “mechanisms” to discover, the kind of learning that experiments or past experience provide is of no use for the future, because it eliminates innovation and creativity and does not take into account the arboreal character and the open-ended nature of the economic process … However, as said before, we can gather precise information, restricted in space and time (data). But, what is the purpose of obtaining this sort of information if uncertainty about future events prevails? … The problem is that taking uncertainty seriously puts in question the relevance that data obtained by means of testing or experimentation have for future situations.

Marqués’ book is a serious challenge to much of mainstream economic thinking and its methodological and philosophical underpinnings. A must-read for anyone interested in the foundations of economic theory.

To yours truly, Marqués’ book is especially important since it shows how far-reaching the effects of taking Keynes’ concept of genuine uncertainty seriously really are.

Almost a hundred years after John Maynard Keynes wrote his seminal A Treatise on Probability (1921), it is still very difficult to find economics textbooks that seriously try to incorporate his far-reaching and incisive analysis of uncertainty, inductive inference and evidential weight.

The standard view in economics and statistics — and the axiomatic probability theory underlying it — is to a large extent based on the rather simplistic idea that ‘more is better.’ But as Keynes argues, ‘more of the same’ is not what is important when making inductive inferences. It’s rather a question of ‘more but different.’

Variation, not replication, is at the core of induction. Finding that p(x|y) = p(x|y & w) doesn’t make w ‘irrelevant.’ Knowing that the probability is unchanged when w is present gives p(x|y & w) another evidential weight (‘weight of argument’). Running 10 replicative experiments does not make you as ‘sure’ of your inductions as running 10,000 varied experiments – even if the probability values happen to be the same.
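
The point is easy to check numerically. Here is a minimal Python sketch (my own illustration, assuming a simple Beta-Binomial set-up with a uniform prior; it captures only the amount-of-evidence side of ‘weight,’ not the replication-versus-variation side):

import math

def posterior_sd(successes, trials):
    # With a uniform Beta(1, 1) prior, the posterior is
    # Beta(1 + successes, 1 + failures); return its standard deviation.
    a, b = 1 + successes, 1 + (trials - successes)
    return math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))

for n in (10, 10_000):
    k = round(0.7 * n)                 # same observed frequency in both cases
    mean = (1 + k) / (2 + n)           # posterior mean: roughly 0.7 both times
    print(f"n = {n:6d}: probability {mean:.2f}, sd {posterior_sd(k, n):.4f}")

# Both bodies of evidence give a probability of about 0.7, but the
# posterior sd shrinks from ~0.13 to ~0.005 -- same probability value,
# very different evidential weight.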

According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but ‘rational expectations.’ Keynes rather thinks that we base our expectations on the confidence or ‘weight’ we put on different events and alternatives. To Keynes, expectations are a question of weighing probabilities by ‘degrees of belief,’ beliefs that often have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents as modelled by ‘modern’ social sciences. And often we “simply do not know.” As Keynes writes in Treatise:

If different wholes were subject to different laws qua wholes and not simply on account of and in proportion to the differences of their parts, knowledge of a part could not lead, it would seem, even to presumptive or probable knowledge as to its association with other parts … In my judgment, the practical usefulness of those modes of inference … on which the boasted knowledge of modern science depends, can only exist … if the universe of phenomena does in fact present those peculiar characteristics of atomism and limited variety which appears more and more clearly as the ultimate result to which material science is tending.

Science according to Keynes should help us penetrate to “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts.” Models can never be more than a starting point in that endeavour. He further argued that it was inadmissible to project history on the future. Consequently, we cannot presuppose that what has worked before, will continue to do so in the future. That statistical models can get hold of correlations between different ‘variables’ is not enough. If they cannot get at the causal structure that generated the data, they are not really ‘identified.’

How strange that writers of economics textbooks do not even touch upon these aspects of scientific methodology that seem to be so fundamental and important for anyone trying to understand how we learn and orient ourselves in an uncertain world. An educated guess on why this is so would be that Keynes’ concepts cannot be squeezed into a single calculable numerical ‘probability.’ In the quest for quantities one turns a blind eye to qualities and looks the other way — but Keynes’ ideas keep creeping out from under the carpet.

Robert Lucas once wrote — in Studies in Business-Cycle Theory — that “in cases of uncertainty, economic reasoning will be of no value.” Now, if that were true, it would put us in a tough dilemma. If we — like Lucas — have to consider uncertainty incompatible with economics being a science, and we actually know for sure that there are several and deeply important situations in real-world contexts where we — both epistemologically and ontologically — face genuine uncertainty, well, then we actually would have to choose between reality and science.

That can’t be right. We all know we do not know very much about the future. We all know the future harbours lots of unknown unknowns. Those are ontological facts we just have to accept. But — I still think it possible we can go for both reality and science, and develop a realist, relevant, non-ergodic economic science.

Where did equality go?

28 February, 2018 at 10:48 | Posted in Politics & Society | 1 Comment

New data from Statistics Sweden (SCB) show that income differences in Sweden have continued to grow. The Gini coefficient, which stood at 0.27 in 2005, had risen to 0.32 by 2016. That is the highest figure since the measurements began.

The share of people with “low economic standard” has also risen, from 10 per cent in 2005 to 14 per cent in 2016.

Today the 10 per cent of the population with the highest incomes take as large a share of total disposable income as the 50 per cent of the population with the lowest incomes.

This dismal picture of equality — in a country that once was a role model and the most equal country in the world — is reinforced further by data presented in a new report from LO, the Swedish Trade Union Confederation. The report shows that the economic elite — in LO’s data comprising 50 CEOs of large Swedish companies — now has a higher relative income than ever before. On average, these CEOs’ incomes correspond to 55 industrial workers’ wages.

How fortunate, then, that we have a Social Democratic labour-movement government in power doing everything it can to reduce the accelerating inequality …

That Don’t Impress Me Much

26 February, 2018 at 17:20 | Posted in Varia | 7 Comments

 

The biggest trouble with modern macroeconomics

26 February, 2018 at 09:07 | Posted in Economics | Comments Off on The biggest trouble with modern macroeconomics

The trouble is not so much that macroeconomists say things that are inconsistent with the facts. The real trouble is that other economists do not care that the macroeconomists do not care about the facts. An indifferent tolerance of obvious error is even more corrosive to science than committed advocacy of error.

Paul Romer 

New-Classical-Real-Business-Cycles-DSGE-New-Keynesian microfounded macromodels try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality.

Opting for cloned representative agents that are all identical is of course not a real solution for analyzing macroeconomic issues. Representative agent models are — as I have argued at length here — rather an evasion whereby issues of distribution, coordination, heterogeneity — everything that really defines macroeconomics — are swept under the rug.

Of course, most macroeconomists know that to use a representative agent is a flagrantly illegitimate method of ignoring real aggregation issues. They keep on with their business, nevertheless, just because it significantly simplifies what they are doing.

Continuing to model a world full of agents behaving as economists — ‘often wrong, but never uncertain’ — is a gross misallocation of intellectual resources and time.

Keynes — a rare bird

25 February, 2018 at 18:04 | Posted in Economics | Comments Off on Keynes — a rare bird

Alfred Marshall once wrote that “good economists are rare birds.” How true that is. One of those rare birds was definitely John Maynard Keynes.

And now the Norwegian economics professor Björn-Ivar Davidsen has written a book about this exceptionally competent economist, philosopher, civil servant, speculator, utopian, university benefactor, art collector, and much, much more.

In the book we are taken on a journey in Keynes’ biographical footsteps, getting to know the life and times of this many-sided intellectual giant. Although the book is mainly biographical, it also contains, not least in the concluding chapters, interesting and well-argued discussions of what constitutes the core of Keynes’ economic thinking.

Useful reading — not least for those who may have studied a little economics at one of our universities and therefore, unfortunately, acquired a completely distorted picture of what was so path-breaking and revolutionary in Keynes’ economic thinking.

With this book the author shows that he himself is also a rare bird. Academic economists who can write are an endangered species. How fortunate, then, that we can read this well-written and interesting book, by an economist who evidently knows how to write fine prose.

Take up and read!

Vingar

25 February, 2018 at 17:13 | Posted in Varia | Comments Off on Vingar

 

Models and economics

23 February, 2018 at 23:10 | Posted in Economics | 4 Comments

Economics is a science of thinking in terms of models joined to the art of choosing models which are relevant to the contemporary world. It is compelled to be this, because, unlike the typical natural science, the material to which it is applied is, in too many respects, not homogeneous through time. The object of a model is to segregate the semi-permanent or relatively constant factors from those which are transitory or fluctuating so as to develop a logical way of thinking about the latter, and of understanding the time sequences to which they give rise in particular cases … Good economists are scarce because the gift for using “vigilant observation” to choose good models, although it does not require a highly specialised intellectual technique, appears to be a very rare one.

John Maynard Keynes (letter to Harrod, 1938)

Hit and run

23 February, 2018 at 21:08 | Posted in Varia | Comments Off on Hit and run

 

Keeping the dream alive

23 February, 2018 at 16:25 | Posted in Economics | 1 Comment

For me, the study of asymmetric information was a very first step toward the realization of a dream. That dream was the development of a behavioral macroeconomics in the original spirit of Keynes’ General Theory. Macroeconomics would then no longer suffer from the ad hockery of the neoclassical synthesis, which had over-ridden the emphasis in The General Theory on the role of psychological and sociological factors, such as cognitive bias, reciprocity, fairness, herding, and social status. My dream was to strengthen macroeconomic theory by incorporating assumptions honed to the observation of such behavior …

Keynes’ General Theory was the greatest contribution to behavioral economics before the present era. Almost everywhere Keynes blamed market failures on psychological propensities (as in consumption) and irrationalities (as in stock market speculation). Immediately after its publication, the economics profession tamed Keynesian economics. They domesticated it as they translated it into the “smooth” mathematics of classical economics. But economies, like lions, are wild and dangerous. Modern behavioral economics has rediscovered the wild side of macroeconomic behavior. Behavioral economists are becoming lion tamers. The task is as intellectually exciting as it is difficult.

George Akerlof

Keynes’ core insight

23 February, 2018 at 11:15 | Posted in Economics | Comments Off on Keynes’ core insight

But these more recent writers like their predecessors were still dealing with a system in which the amount of the factors employed was given and the other relevant facts were known more or less for certain … At any given time facts and expectations were assumed to be given in a definite and calculable form … The calculus of probability, tho mention of it was kept in the background, was supposed to be capable of reducing uncertainty to the same calculable status as that of certainty itself …

The fact that our knowledge of the future is fluctuating, vague and uncertain, renders Wealth a peculiarly unsuitable subject for the methods of the classical economic theory …

By “uncertain” knowledge, let me explain, I do not mean merely to distinguish what is known for certain from what is only probable … The sense in which I am using the term is that in which the prospect of a European war is uncertain, or the price of copper and the rate of interest twenty years hence … About these matters there is no scientific basis on which to form any calculable probability whatever. We simply do not know.

John Maynard Keynes

To understand real world ‘non-routine’ decisions and unforeseeable changes in behaviour, ergodic probability distributions are of no avail. In a world full of genuine uncertainty – where real historical time rules the roost – the probabilities that ruled the past cannot simply be assumed to be those that will rule the future.

Time is what prevents everything from happening at once. To assume that economic processes are ergodic and concentrate on ‘ensemble averages’ is not a sensible way for dealing with the kind of genuine uncertainty that permeates real-world economies.

What is important about real social and economic processes being nonergodic is that uncertainty – not risk – rules the roost. Thinking about uncertainty in terms of ‘rational expectations’ and ‘ensemble averages’ has had seriously bad repercussions on the financial system.

Keynes’ uncertainty concept has an ontological foundation. Of course this also has repercussions on the issue of ergodicity in a strict methodological and mathematical-statistical sense.

The most interesting and far-reaching difference between an epistemological and an ontological view on uncertainty is that if one subscribes to the former, one opens the door to the mistaken belief that with better information and greater computing power we should somehow always be able to calculate probabilities and describe the world as an ergodic universe. As Keynes convincingly argued, that is ontologically just not possible.

To Keynes, the source of uncertainty is in the nature of the real – nonergodic – world. It has to do not primarily with the epistemological fact of us not knowing the things that today are unknown, but rather with the much deeper and far-reaching ontological fact that there often is no firm basis on which we can form quantifiable probabilities and expectations at all.

If we really want to understand and analyze real-world phenomena, we have to accept them on their own premisses. Our quest for knowledge should never decide how to perceive reality.

The most important and far-reaching premiss on which modern mainstream economics builds is the assumption that genuine uncertainty is reducible to calculable risk. Since this is not the case, modern mainstream economics is also totally useless.

Take the rational expectations assumption. Rational expectations in the mainstream economists’ world imply that relevant distributions have to be time independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality, it is straining one’s beliefs to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons. In the world of the rational expectations hypothesis, we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that’s also the reason why allowing for cases where agents make ‘predictable errors’ in DSGE models doesn’t take us any closer to a relevant and realist depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say on real economies, financial crises and the decisions and choices real people make, we have to replace the rational expectations hypothesis and calculable risk with more relevant and realistic assumptions concerning uncertainty than childish roulette and urn analogies. Or as Quetelet once declared — “l’urne que nous interrogeons, c’est la nature.”

Courage that will live on forever

22 February, 2018 at 18:02 | Posted in Politics & Society | 4 Comments

 

Seventy-five years ago, on February 22, 1943, Hans and Sophie Scholl — members of the resistance group ‘Die Weisse Rose’ — were killed by the Nazis.

Courage is not anything very common, and the value we put on it is a witness to its rarity.

Courage is the capability to confront fear: when facing the powerful and mighty, not to step back, but to stand up for one’s right not to be humiliated or abused in any way by the rich and powerful.

Courage is to do the right thing in spite of danger and fear. To keep on even when opportunities to turn back are offered. Like in the great stories. The ones where people have lots of chances of turning back — but don’t.

Dignity, a better life, or justice and the rule of law, are things worth fighting for. Not stepping back, in spite of confronting the mighty and powerful, creates courageous acts that stay in our memories and mean something – as when Hans and Sophie Scholl decided to fight the Nazi atrocities. May their beautiful souls live on forever.

Economics — a science with wacky views of human behaviour

21 February, 2018 at 17:08 | Posted in Economics | 4 Comments

There is something about the way economists construct their models nowadays that obviously doesn’t sit right.

The one-sided, almost religious, insistence on axiomatic-deductivist modelling as the only scientific activity worthy of pursuing in economics still has not given way to methodological pluralism based on ontological considerations (rather than formalistic tractability). In its search for model-based rigour and certainty, ‘modern’ economics has turned out to be a totally hopeless project in terms of real-world relevance.

If macroeconomic models — no matter what ilk — build on microfoundational assumptions of representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, the warrants for supposing that model-based conclusions or hypotheses of causally relevant mechanisms or regularities can be bridged to real-world target systems are obviously non-justifiable. Incompatibility between actual behaviour and the behaviour in macroeconomic models building on representative actors and rational expectations microfoundations shows the futility of trying to represent real-world target systems with models flagrantly at odds with reality. As Robert Gordon once had it:

Rigor competes with relevance in macroeconomic and monetary theory, and in some lines of development macro and monetary theorists, like many of their colleagues in micro theory, seem to consider relevance to be more or less irrelevant.

Science and the quest for truth

20 February, 2018 at 19:01 | Posted in Economics | 4 Comments

In my view, scientific theories are not to be considered ‘true’ or ‘false.’ In constructing such a theory, we are not trying to get at the truth, or even to approximate to it: rather, we are trying to organize our thoughts and observations in a useful manner.

Robert Aumann

What a handy view of science.

How reassuring for all of you who have always thought that believing in the tooth fairy makes you understand what happens to kids’ teeth. Now a ‘Nobel prize’ winning economist tells you that whether or not there are such things as tooth fairies doesn’t really matter. Scientific theories are not about what is true or false, but whether ‘they enable us to organize and understand our observations’ …

Mirabile dictu!

What Aumann and other defenders of scientific storytelling ‘forget’ is that potential explanatory power achieved in thought-experimental models is not enough for attaining real explanations. Model explanations are at best conjectures, and whether they do or do not explain things in the real world is something we have to test. To just believe that you understand or explain things better with thought experiments is not enough. Without a warranted export certificate to the real world, model explanations are pretty worthless. Proving things in models is not enough. Truth is an important concept in real science.

Marx and Keynes on the contradictions of capitalism

19 February, 2018 at 14:56 | Posted in Economics | 8 Comments

Each capitalist, Marx noted, has an ambiguous relation to the workers. On the one hand, she wants the workers she employs to have low wages, since that makes for high profits. On the other hand, she wants all other workers to have high wages, since that makes for high demand for her products. Although it is possible for any one capitalist to have both desires satisfied, it is logically impossible for this to be the case for all capitalists simultaneously. This is a ‘contradiction of capitalism’ that Keynes spelled out as follows. In a situation of falling profit, each capitalist responds by laying off workers, thus saving on the wage bill. Yet since the demand of workers directly or indirectly is what sustains the firm, the effect of all capitalists’ simultaneously laying off workers will be a further reduction in profit, causing more lay-offs or bankruptcies.

Flight From The City

19 February, 2018 at 13:55 | Posted in Varia | Comments Off on Flight From The City

 

Jóhann Jóhannsson (1969–2018) R.I.P.

The future — something we know very little about

18 February, 2018 at 19:54 | Posted in Economics | 1 Comment

All these pretty, polite techniques, made for a well-panelled Board Room and a nicely regulated market, are liable to collapse. At all times the vague panic fears and equally vague and unreasoned hopes are not really lulled, and lie but a little way below the surface.

Perhaps the reader feels that this general, philosophical disquisition on the behavior of mankind is somewhat remote from the economic theory under discussion. But I think not. Tho this is how we behave in the marketplace, the theory we devise in the study of how we behave in the market place should not itself submit to market-place idols. I accuse the classical economic theory of being itself one of these pretty, polite techniques which tries to deal with the present by abstracting from the fact that we know very little about the future.

I dare say that a classical economist would readily admit this. But, even so, I think he has overlooked the precise nature of the difference which his abstraction makes between theory and practice, and the character of the fallacies into which he is likely to be led.

John Maynard Keynes

Poland’s Law and Justice — now and then

18 February, 2018 at 15:59 | Posted in Politics & Society | 2 Comments

A new law passed by Poland’s ruling Law and Justice Party and signed by President Andrzej Duda on Feb. 6, means that you may end up in prison for three years if you “publicly and against the facts attribute to the Polish nation or the Polish state responsibility or co-responsibility for Nazi crimes committed by the German Third Reich.”

The Polish Parliament ordered a new investigation into the Jedwabne atrocity in July 2000 … Over the course of two years, investigators from the Polish Institute of National Remembrance (IPN) interviewed some 111 witnesses … On July 9, 2002, IPN released the final findings of its two-year-long investigation. In a carefully worded summary IPN stated its principal conclusions as follows:

The perpetrators of the crime sensu stricto were Polish inhabitants of Jedwabne and its environs; responsibility for the crime sensu largo could be ascribed to the Germans. IPN found that Poles played a “decisive role” in the massacre, but the massacre was “inspired by the Germans”. The massacre was carried out in full view of the Germans, who were armed and had control of the town, and the Germans refused to intervene and halt the killings. IPN wrote: “The presence of German military policemen … and other uniformed Germans … was tantamount to consent to, and tolerance of, the crime.”

Wikipedia

China concerts

16 February, 2018 at 22:02 | Posted in Varia | Comments Off on China concerts

 

The Bayesian folly

16 February, 2018 at 18:18 | Posted in Economics | 1 Comment

Assume you’re a Bayesian turkey, holding a nonzero probability belief in the hypothesis H that “people are nice vegetarians that do not eat turkeys,” so that every day you see the sun rise seems to confirm your belief. For every day you survive, you update your belief according to Bayes’ Rule

P(H|e) = [P(e|H)P(H)]/P(e),

where evidence e stands for “not being eaten” and P(e|H) = 1. Given that there do exist other hypotheses than H, P(e) is less than 1 and so P(H|e) is greater than P(H). Every day you survive increases your probability belief that you will not be eaten. This is totally rational according to the Bayesian definition of rationality. Unfortunately — as Bertrand Russell famously noticed — for every day that goes by, the traditional Christmas dinner also gets closer and closer …
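
A minimal Python sketch of the turkey’s updating arithmetic (the prior P(H) = 0.5 and the rival hypothesis — under which the turkey survives any given day with probability 0.9 — are my own hypothetical numbers, chosen only to make the rule concrete):

p_h = 0.5                                 # prior belief in H
for day in range(1, 11):
    p_e = 1.0 * p_h + 0.9 * (1 - p_h)     # total probability of 'not eaten'
    p_h = 1.0 * p_h / p_e                 # Bayes' Rule, with P(e|H) = 1
    print(f"day {day:2d}: P(H|e) = {p_h:.3f}")

# P(H|e) climbs towards 1 with every uneventful day --
# right up until Christmas.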

Neoclassical economics nowadays usually assumes that agents who have to make choices under conditions of uncertainty behave according to Bayesian rules — that is, they maximize expected utility with respect to some subjective probability measure that is continually updated according to Bayes’ theorem. If not, they are supposed to be irrational.

Bayesianism reduces questions of rationality to questions of internal consistency (coherence) of beliefs, but — even granted this questionable reductionism — do rational agents really have to be Bayesian? As I have been arguing repeatedly over the years, there is no strong warrant for believing so.

The nodal point here is — of course — that although Bayes’ Rule is mathematically unquestionable, that doesn’t qualify it as indisputably applicable to scientific questions. As one of my favourite statistics bloggers — Andrew Gelman — puts it:

The fundamental objections to Bayesian methods are twofold: on one hand, Bayesian methods are presented as an automatic inference engine, and this raises suspicion in anyone with applied experience, who realizes that different methods work well in different settings … Bayesians promote the idea that a multiplicity of parameters can be handled via hierarchical, typically exchangeable, models, but it seems implausible that this could really work automatically. In contrast, much of the work in modern non-Bayesian statistics is focused on developing methods that give reasonable answers using minimal assumptions.

The second objection to Bayes comes from the opposite direction and addresses the subjective strand of Bayesian inference: the idea that prior and posterior distributions represent subjective states of knowledge. Here the concern from outsiders is, first, that as scientists we should be concerned with objective knowledge rather than subjective belief, and second, that it’s not clear how to assess subjective knowledge in any case.

Beyond these objections is a general impression of the shoddiness of some Bayesian analyses, combined with a feeling that Bayesian methods are being oversold as an all-purpose statistical solution to genuinely hard problems. Compared to classical inference, which focuses on how to extract the information available in data, Bayesian methods seem to quickly move to elaborate computation. It does not seem like a good thing for a generation of statistics to be ignorant of experimental design and analysis of variance, instead of becoming experts on the convergence of the Gibbs sampler. In the short term this represents a dead end, and in the long term it represents a withdrawal of statisticians from the deeper questions of inference and an invitation for econometricians, computer scientists, and others to move in and fill in the gap …

Bayesian inference is a coherent mathematical theory but I don’t trust it in scientific applications. Subjective prior distributions don’t transfer well from person to person, and there’s no good objective principle for choosing a noninformative prior (even if that concept were mathematically defined, which it’s not). Where do prior distributions come from, anyway? I don’t trust them and I see no reason to recommend that other people do, just so that I can have the warm feeling of philosophical coherence …

As Brad Efron wrote in 1986, Bayesian theory requires a great deal of thought about the given situation to apply sensibly, and recommending that scientists use Bayes’ theorem is like giving the neighborhood kids the key to your F-16 …

Economics education — teaching cohorts after cohorts of students useless theories

15 February, 2018 at 20:17 | Posted in Economics | 4 Comments

Nowadays there is almost no place whatsoever in economics education for courses in the history of economic thought and economic methodology.

This is deeply worrying.

A science that doesn’t self-reflect and ask important methodological and science-theoretical questions about its own activity is a science in dire straits.

How did we end up in this sad state?

Philip Mirowski gives the following answer:

phil After a brief flirtation in the 1960s and 1970s, the grandees of the economics profession took it upon themselves to express openly their disdain and revulsion for the types of self-reflection practiced by ‘methodologists’ and historians of economics, and to go out of their way to prevent those so inclined from occupying any tenured foothold in reputable economics departments. It was perhaps no coincidence that history and philosophy were the areas where one found the greatest concentrations of skeptics concerning the shape and substance of the post-war American economic orthodoxy. High-ranking economics journals, such as the American Economic Review, the Quarterly Journal of Economics and the Journal of Political Economy, declared that they would cease publication of any articles whatsoever in the area, after a prior history of acceptance.

Once this policy was put in place, algorithmic journal rankings were used to deny hiring and promotion at the commanding heights of economics to those with methodological leanings. Consequently, the grey-beards summarily expelled both philosophy and history from the graduate economics curriculum; and then they chased them out of the undergraduate curriculum as well. This latter exile was the bitterest, if only because many undergraduates often want to ask why the profession believes what it does, and hear others debate the answers, since their own allegiances are still in the process of being formed. The rationale tendered to repress this demand was that the students needed still more mathematics preparation, more statistics and more tutelage in ‘theory’, which meant in practice a boot camp regimen consisting of endless working of problem sets, problem sets and more problem sets, until the poor tyros were so dizzy they did not have the spunk left to interrogate the masses of journal articles they had struggled to absorb.

Methodology is about how we do economics, how we evaluate theories, models and arguments. To know and think about methodology is important for every economist. Without methodological awareness it’s really impossible to understand what you are doing and why you’re doing it. Dismissing methodology is dismissing a necessary and vital part of science.

Already back in 1991, a commission chaired by Anne Krueger and including people like Kenneth Arrow, Edward Leamer, and Joseph Stiglitz, reported from their own experience “that it is an underemphasis on the ‘linkages’ between tools, both theory and econometrics, and ‘real world problems’ that is the weakness of graduate education in economics,” and that both students and faculty sensed “the absence of facts, institutional information, data, real-world issues, applications, and policy problems.” And in conclusion, they wrote that “graduate programs may be turning out a generation with too many idiot savants skilled in technique but innocent of real economic issues.”

Not much is different today. Economics — and economics education — is still in dire need of a remake.

Twenty-five years ago, Phil Mirowski was invited to give a speech on themes from his book More Heat than Light at my economics department in Lund, Sweden. All the mainstream neoclassical professors were there. Their theories were totally mangled and no one — absolutely no one — had anything to say even remotely reminiscent of a defence. Nonplussed, one of them finally asked in total desperation: “But what shall we do then?”

Yes indeed — what shall they do when their emperor has turned out to be naked?

More and more young economics students want to see a real change in economics and the way it’s taught. They want something other than the same old mainstream neoclassical catechism. They don’t want to be force-fed with useless mainstream neoclassical theories and models.

Ask the mountains

14 February, 2018 at 08:47 | Posted in Economics, Varia | Comments Off on Ask the mountains

 

The problem of extrapolation

14 February, 2018 at 00:01 | Posted in Theory of Science & Methodology | 8 Comments

There are two basic challenges that confront any account of extrapolation that seeks to resolve the shortcomings of simple induction. One challenge, which I call the extrapolator’s circle, arises from the fact that extrapolation is worthwhile only when there are important limitations on what one can learn about the target by studying it directly. The challenge, then, is to explain how the suitability of the model as a basis for extrapolation can be established given only limited, partial information about the target … The second challenge is a direct consequence of the heterogeneity of populations studied in biology and social sciences. Because of this heterogeneity, it is inevitable that there will be causally relevant differences between the model and the target population.

In economics — as a rule — we can’t experiment on the real-world target directly. To experiment, economists therefore standardly construct ‘surrogate’ models and perform ‘experiments’ on them. To be of interest to us, these surrogate models have to be shown to be relevantly ‘similar’ to the real-world target, so that knowledge from the model can be exported to the real-world target. The fundamental problem highlighted by Steel is that this ‘bridging’ is deeply problematic — to show that what is true of the model is also true of the real-world target, we have to know what is true of the target, but to know what is true of the target we have to know that we have a good model …

Most models in science are representations of something else. Models “stand for” or “depict” specific parts of a “target system” (usually the real world). A model that has neither surface nor deep resemblance to important characteristics of real economies ought to be treated with prima facie suspicion. How could we possibly learn about the real world if there are no parts or aspects of the model that have relevant and important counterparts in the real-world target system? The burden of proof lies on the theoretical economists who think they have contributed anything of scientific relevance without even hinting at any bridge enabling us to traverse from model to reality. All theories and models have to use sign vehicles to convey some kind of content that may be used for saying something about the target system. But purpose-built tractability assumptions — like, e.g., invariance, additivity, faithfulness, modularity, common knowledge, etc., etc. — made solely to secure a way of reaching deductively validated results in mathematical models, are of little value if they cannot be validated outside of the model.

All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is (no longer) the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

Theories are difficult to directly confront with reality. Economists therefore build models of their theories. Those models are representations that are directly examined and manipulated to indirectly say something about the target systems.

There are economic methodologists and philosophers who argue for a less demanding view on modeling and theorizing in economics. Some theoretical economists even deem it quite enough to consider economics a mere “conceptual activity” where the model is not so much an abstraction from reality as a kind of “parallel reality”. By considering models as such constructions, the economist distances the model from the intended target, demanding only that the models be credible, thereby enabling him to make inductive inferences to the target systems.

But what gives license to this leap of faith, this “inductive inference”? Within-model inferences in formal-axiomatic models are usually deductive, but that does not come with a warrant of reliability for inferring conclusions about specific target systems. Since all models in a strict sense are false (necessarily building in part on false assumptions) deductive validity cannot guarantee epistemic truth about the target system. To argue otherwise would surely be an untenable overestimation of the epistemic reach of surrogate models.

Models do not only face theory. They also have to look to the world. But being able to model a credible world, a world that somehow could be considered real or similar to the real world, is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still possibly serve our pursuit of truth. But then they cannot be unrealistic or false in just any way. The falsehood or unrealisticness has to be qualified (in terms of resemblance, relevance, etc.). At the very least, the minimalist demand on models in terms of credibility has to give way to a stronger epistemic demand of appropriate similarity and plausibility. One could of course also ask for a sensitivity or robustness analysis, but the credible world, even after having been tested for sensitivity and robustness, can still be a long way from reality – and unfortunately often in ways we know are important. Robustness of claims in a model does not per se give a warrant for exporting the claims to real-world target systems.

Questions of external validity — the claims the extrapolation inference is supposed to deliver — are important. It can never be enough that models somehow are regarded as internally consistent. One always also has to pose questions of consistency with the data. Internal consistency without external validity is worth nothing.

The arrow of time in a non-ergodic world

13 February, 2018 at 09:00 | Posted in Theory of Science & Methodology | 3 Comments

For the vast majority of scientists, thermodynamics had to be limited strictly to equilibrium. That was the opinion of J. Willard Gibbs, as well as of Gilbert N. Lewis. For them, irreversibility associated with unidirectional time was anathema …

I myself experienced this type of hostility in 1946 … After I had presented my own lecture on irreversible thermodynamics, the greatest expert in the field of thermodynamics made the following comment: ‘I am astonished that this young man is so interested in nonequilibrium physics. Irreversible processes are transient. Why not wait and study equilibrium as everyone else does?’ I was so amazed at this response that I did not have the presence of mind to answer: ‘But we are all transient. Is it not natural to be interested in our common human condition?’

Time is what prevents everything from happening at once. To simply assume that economic processes are ergodic and concentrate on ensemble averages — and hence in any relevant sense timeless — is not a sensible way for dealing with the kind of genuine uncertainty that permeates real-world economies.

Ergodicity and the all-important difference between time averages and ensemble averages are difficult concepts — so let me try to explain the meaning of these concepts by means of a couple of simple examples.

Let’s say you’re offered a gamble where on a roll of a fair die you will get €10 billion if you roll a six, and pay me €1 billion if you roll any other number.

Would you accept the gamble?

If you’re an economics student you probably would, because that’s what you’re taught to be the only thing consistent with being rational. You would arrest the arrow of time by imagining six different “parallel universes” where the independent outcomes are the numbers from one to six, and then weight them using their stochastic probability distribution. Calculating the expected value of the gamble – the ensemble average – by averaging over all these weighted outcomes, you would actually be a moron if you didn’t take the gamble (the expected value of the gamble being 1/6 × €10 billion – 5/6 × €1 billion ≈ €0.83 billion).

If you’re not an economist you would probably trust your common sense and decline the offer, knowing that a large risk of bankrupting one’s economy is not a very rosy perspective for the future. Since you can’t really arrest or reverse the arrow of time, you know that once you have lost the €1 billion, it’s all over. The large likelihood that you go bust weighs heavier than the 17% chance of you becoming enormously rich. By computing the time average – imagining one real universe where the six different but dependent outcomes occur consecutively – we would soon be aware of our assets disappearing, and a fortiori that it would be irrational to accept the gamble.
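
A minimal simulation (my own sketch: wealth in billions of euro, 100,000 players each starting with exactly €1 billion and going bust once the stake can no longer be covered) shows both perspectives at once:

import random
random.seed(1)

players, rounds = 100_000, 10
total_wealth, bust = 0.0, 0

for _ in range(players):
    wealth = 1.0
    for _ in range(rounds):
        if wealth < 1.0:                  # cannot cover the €1 billion stake
            break
        wealth += 10.0 if random.randint(1, 6) == 6 else -1.0
    if wealth < 1.0:
        bust += 1
    total_wealth += wealth

print(f"ensemble average wealth: {total_wealth / players:.2f}")
print(f"share of players bust:   {bust / players:.1%}")

# The ensemble average comes out well above the initial €1 billion,
# yet more than 80% of the individual histories end in bankruptcy.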

Why is the difference between ensemble and time averages of such importance in economics? Well, basically, because when assuming the processes to be ergodic, ensemble and time averages are identical.

Assume we have a market with an asset priced at €100. Then imagine the price first goes up by 50% and then later falls by 50%. The ensemble average for this asset would be €100 – because we here envision two parallel universes (markets) where the asset price falls in one universe (market) by 50% to €50, and in the other universe (market) goes up by 50% to €150, giving an average of €100 ((150 + 50)/2). The time average for this asset would be €75 – because we here envision one universe (market) where the asset price first rises by 50% to €150 and then falls by 50% to €75 (0.5 × 150).

From the ensemble perspective nothing really, on average, happens. From the time perspective lots of things really, on average, happen. Assuming ergodicity there would have been no difference at all.
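
In code, the two averages come out as follows (a minimal sketch of the arithmetic above):

price, up, down = 100.0, 1.5, 0.5

ensemble_average = (price * up + price * down) / 2    # (150 + 50) / 2 = 100
time_path = price * up * down                         # 100 * 1.5 * 0.5 = 75
print(ensemble_average, time_path)

# Over a long random sequence of such moves the arithmetic (ensemble)
# mean factor per period is (1.5 + 0.5) / 2 = 1, while the geometric
# (time) mean factor is (1.5 * 0.5) ** 0.5 ~ 0.87 -- the ensemble
# average stays put while almost every individual price path decays.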

On a more economic-theoretical level, the difference between ensemble and time averages also highlights the problems concerning the neoclassical theory of expected utility.

When applied to the neoclassical theory of expected utility, one thinks in terms of “parallel universes” and asks what the expected return of an investment is, calculated as an average over these parallel universes. In our dice gamble above, it is as if one supposes that various “I”s are rolling the die and that the losses of many of them will be offset by the huge profit that one of these “I”s makes. But this ensemble average does not work for an individual, for whom a time average better reflects the experience made in the “non-parallel universe” in which we live.

Time averages give a more realistic answer, where one thinks in terms of the only universe we actually live in and asks what the expected return of an investment is, calculated as an average over time.

Since we cannot go back in time – entropy and the arrow of time make this impossible – and the bankruptcy option is always at hand (extreme events and “black swans” are always possible) we have nothing to gain from thinking in terms of ensembles.

Actual events follow a fixed pattern of time, where events are often linked in a multiplicative process (as e. g. investment returns with “compound interest”) which is basically non-ergodic.

Instead of arbitrarily assuming that people have a certain type of utility function – as in the neoclassical theory – time average considerations show that we can obtain a less arbitrary and more accurate picture of real people’s decisions and actions by basically assuming that time is irreversible. When our assets are gone, they are gone. The fact that in a parallel universe they could conceivably have been replenished is of little comfort to those who live in the one and only possible world that we call the real world.

Our gamble example can be applied to more traditional economic issues. If we think of an investor, we can basically describe his situation in terms of such a repeated bet. What fraction of his assets should an investor – who is about to make a large number of repeated investments – bet on his feeling that he can better evaluate an investment (p = 0.6) than the market (p = 0.5)? The greater the fraction, the greater is the leverage. But also – the greater is the risk. Letting p be the probability that his investment valuation is correct and (1 – p) the probability that the market’s valuation is correct, he optimizes the rate of growth of his investments by investing a fraction of his assets equal to the difference between the probabilities that he will “win” and “lose”. This means that at each investment opportunity (according to the so-called Kelly criterion) he is to invest the fraction 0.6 – (1 – 0.6), i.e. 20% of his assets (and the optimal average growth rate of investment can be shown to be about 2% (0.6 log (1.2) + 0.4 log (0.8))).
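
The Kelly arithmetic above can be checked in a few lines (a sketch assuming an even-odds bet with p = 0.6):

import math

p = 0.6
f = p - (1 - p)                                   # Kelly fraction: 0.2
g = p * math.log(1 + f) + (1 - p) * math.log(1 - f)
print(f"fraction of assets to bet: {f:.0%}")      # 20%
print(f"optimal growth rate:       {g:.1%}")      # ~2.0% per bet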

Time average considerations show that because we cannot go back in time, we should not take excessive risks. High leverage increases the risk of bankruptcy. This should also be a warning for the financial world, where the constant quest for greater and greater leverage – and risks – creates extensive and recurrent systemic crises. A more appropriate level of risk-taking is a necessary ingredient in any policy to curb excessive risk-taking.

To understand real world “non-routine” decisions and unforeseeable changes in behaviour, ergodic probability distributions are of no avail. In a world full of genuine uncertainty — where real historical time rules the roost — the probabilities that ruled the past are not necessarily those that will rule the future.

Irreversibility can no longer be identified with a mere appearance that would disappear if we had perfect knowledge … Figuratively speaking, matter at equilibrium, with no arrow of time, is ‘blind,’ but with the arrow of time, it begins to ‘see’ … The claim that the arrow of time is ‘only phenomenological,’ or subjective, is therefore absurd. We are actually the children of the arrow of time, of evolution, not its progenitors.

Ilya Prigogine

So what you’re saying is …

13 February, 2018 at 08:55 | Posted in Politics & Society | 7 Comments

 

Trump on Britain’s NHS

13 February, 2018 at 08:17 | Posted in Politics & Society | 1 Comment

 

How central banks create money

12 February, 2018 at 11:10 | Posted in Economics | 5 Comments

Have you ever asked yourself how central banks create money? BBC Radio gives the answer.

The limits of probabilistic reasoning

12 February, 2018 at 09:48 | Posted in Statistics & Econometrics | 8 Comments

Probabilistic reasoning in science — especially Bayesianism — reduces questions of rationality to questions of internal consistency (coherence) of beliefs, but, even granted this questionable reductionism, it’s not self-evident that rational agents really have to be probabilistically consistent. There is no strong warrant for believing so. Rather, there is strong evidence that we run into huge problems if we let probabilistic reasoning become the dominant method for doing research in social sciences on problems that involve risk and uncertainty.

In many of the situations that are relevant to economics, one could argue that there is simply not enough of adequate and relevant information to ground beliefs of a probabilistic kind and that in those situations it is not possible, in any relevant way, to represent an individual’s beliefs in a single probability measure.

Say you have come to learn (based on your own experience and tons of data) that the probability of you becoming unemployed in Sweden is 10%. Having moved to another country (where you have no experience of your own and no data), you have no information on unemployment and a fortiori nothing on which to ground any probability estimate. A Bayesian would, however, argue that you would have to assign probabilities to the mutually exclusive alternative outcomes and that these have to add up to 1 if you are rational. That is, in this case – and based on symmetry – a rational individual would have to assign a probability of 50% to becoming unemployed and 50% to becoming employed.

That feels intuitively wrong, though, and I guess most people would agree. Bayesianism cannot distinguish between symmetry-based probabilities from information and symmetry-based probabilities from an absence of information. In these kinds of situations, most of us would rather say that it is simply irrational to be a Bayesian and better instead to admit that we “simply do not know” or that we feel ambiguous and undecided. Arbitrary and ungrounded probability claims are more irrational than being undecided in the face of genuine uncertainty, so if there is not sufficient information to ground a probability distribution it is better to acknowledge that simpliciter, rather than pretending to possess a certitude that we simply do not possess.

I think this critique of Bayesianism is in accordance with the views of John Maynard Keynes’ A Treatise on Probability (1921) and General Theory (1937). According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but rational expectations. Sometimes we “simply do not know.” Keynes would not have accepted the view of Bayesian economists, according to whom expectations “tend to be distributed, for the same information set, about the prediction of the theory.” Keynes, rather, thinks that we base our expectations on the confidence or ‘weight’ we put on different events and alternatives. To Keynes, expectations are a question of weighing probabilities by ‘degrees of belief,’ beliefs that have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents modelled by probabilistically reasoning Bayesian economists.

We always have to remember that economics and statistics are two quite different things, and as long as economists cannot identify their statistical theories with real-world phenomena there is no real warrant for taking their statistical inferences seriously.

Just as there is no such thing as a ‘free lunch,’ there is no such thing as a ‘free probability.’ To be able at all to talk about probabilities, you have to specify a model. If there is no chance set-up or model that generates the probabilistic outcomes or events – in statistics one refers to any process where you observe or measure as an experiment (rolling a die) and the results obtained as the outcomes or events (number of points rolled with the die, being e.g. 3 or 5) of the experiment – then, strictly speaking, there is no event at all.

Probability is a relational element. It always must come with a specification of the model from which it is calculated. And then to be of any empirical scientific value it has to be shown to coincide with (or at least converge to) real data generating processes or structures – something seldom or never done in economics.

And this is the basic problem!

If you have a fair roulette-wheel, you can arguably specify probabilities and probability density distributions. But how do you conceive of the analogous ‘nomological machines’ for prices, gross domestic product, income distribution etc? Only by a leap of faith. And that does not suffice in science. You have to come up with some really good arguments if you want to persuade people into believing in the existence of socio-economic structures that generate data with characteristics conceivable as stochastic events portrayed by probabilistic density distributions! Not doing that, you simply conflate statistical and economic inferences.
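
The contrast in miniature (my own sketch): for a roulette wheel the chance set-up can be written down explicitly, and the probabilities simply fall out of it. Nothing comparable is ever written down for GDP or income distributions:

from fractions import Fraction

pockets = list(range(37))                  # European wheel: 0-36
red = {1, 3, 5, 7, 9, 12, 14, 16, 18,
       19, 21, 23, 25, 27, 30, 32, 34, 36}

p_red = Fraction(len(red), len(pockets))
print(p_red)                               # 18/37, derived from the set-up

# For prices, GDP or income distributions no such chance set-up is ever
# specified -- so the analogous 'probabilities' rest on a leap of faith,
# not on a derivation.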

The present ‘machine learning’ and ‘big data’ hype shows that many social scientists — falsely — think that they can get away with analysing real-world phenomena without any (commitment to) theory. But — data never speaks for itself. Without a prior statistical set-up, there actually are no data at all to process. And — using a machine learning algorithm will only produce what you are looking for. Theory matters.

Causality in social sciences — and economics — can never solely be a question of statistical inference. Causality entails more than predictability, and really explaining social phenomena in depth requires theory. Analysis of variation — the foundation of all econometrics — can never in itself reveal how these variations are brought about. Only when we are able to tie actions, processes or structures to the statistical relations detected can we say that we are getting at relevant explanations of causation.

Most facts have many different, possible, alternative explanations, but we want to find the best of all contrastive (since all real explanation takes place relative to a set of alternatives) explanations. So which is the best explanation? Many scientists, influenced by statistical reasoning, think that the likeliest explanation is the best explanation. But the likelihood of x is not in itself a strong argument for thinking it explains y. I would rather argue that what makes one explanation better than another are things like aiming for and finding powerful, deep, causal features and mechanisms that we have warranted and justified reasons to believe in. Statistical reasoning — especially the variety based on a Bayesian epistemology — generally has no room for these kinds of explanatory considerations. The only thing that matters is the probabilistic relation between evidence and hypothesis. That is also one of the main reasons I find abduction — inference to the best explanation — a better description and account of what constitutes actual scientific reasoning and inferences.

And even worse — some economists using statistical methods think that algorithmic formalisms somehow give them access to causality. That is, however, simply not true. Assuming ‘convenient’ things like ‘faithfulness’ or ‘stability’ is to assume what has to be proven. Deductive-axiomatic methods used in statistics do not produce evidence for causal inferences. The real causality we are searching for is the one existing in the real world around us. If there is no warranted connection between axiomatically derived statistical theorems and the real world, well, then we haven’t really obtained the causation we are looking for.
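That assuming ‘faithfulness’ is to assume what has to be proven can be seen in a small simulation (illustrative coefficients, nothing more): two causal paths that cancel each other exactly make cause and effect come out statistically independent.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
x = rng.standard_normal(n)
m = 1.0 * x + rng.standard_normal(n)             # path x -> m
y = 1.0 * x - 1.0 * m + rng.standard_normal(n)   # paths x -> y (+1) and m -> y (-1)

print(f"corr(x, y) = {np.corrcoef(x, y)[0, 1]:+.3f}")  # close to zero
# x genuinely causes y, but the two paths cancel, so x and y come out
# (near-)independent. An algorithm that assumes faithfulness reads this as
# 'no causal connection': the axiom, not the data, does the work.
```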

New Classical macroeconomists — people having their heads fuddled with nonsense

11 February, 2018 at 11:10 | Posted in Economics | 3 Comments

McNees documented the radical break between the 1960s and 1970s. The question is: what are the possible responses that economists and economics can make to those events?

One possible response is that of Professors Lucas and Sargent. They describe what happened in the 1970s in a very strong way with a polemical vocabulary reminiscent of Spiro Agnew. Let me quote some phrases that I culled from the paper: “wildly incorrect,” “fundamentally flawed,” “wreckage,” “failure,” “fatal,” “of no value,” “dire implications,” “failure on a grand scale,” “spectacular recent failure,” “no hope” … I think that Professors Lucas and Sargent really seem to be serious in what they say, and in turn they have a proposal for constructive research that I find hard to talk about sympathetically. They call it equilibrium business cycle theory, and they say very firmly that it is based on two terribly important postulates — optimizing behavior and perpetual market clearing. When you read closely, they seem to regard the postulate of optimizing behavior as self-evident and the postulate of market-clearing behavior as essentially meaningless. I think they are too optimistic, since the one that they think is self-evident I regard as meaningless and the one that they think is meaningless, I regard as false. The assumption that everyone optimizes implies only weak and uninteresting consistency conditions on their behavior. Anything useful has to come from knowing what they optimize, and what constraints they perceive. Lucas and Sargent’s casual assumptions have no special claim to attention …

It is plain as the nose on my face that the labor market and many markets for produced goods do not clear in any meaningful sense. Professors Lucas and Sargent say after all there is no evidence that labor markets do not clear, just the unemployment survey. That seems to me to be evidence. Suppose an unemployed worker says to you “Yes, I would be glad to take a job like the one I have already proved I can do because I had it six months ago or three or four months ago. And I will be glad to work at exactly the same wage that is being paid to those exactly like myself who used to be working at that job and happen to be lucky enough still to be working at it.” Then I’m inclined to label that a case of excess supply of labor and I’m not inclined to make up an elaborate story of search or misinformation or anything of the sort. By the way I find the misinformation story another gross implausibility. I would like to see direct evidence that the unemployed are more misinformed than the employed, as I presume would have to be the case if everybody is on his or her supply curve of employment. Similarly, if the Chrysler Motor Corporation tells me that it would be happy to make and sell 1000 more automobiles this week at the going price if only it could find buyers for them, I am inclined to believe they are telling me that price exceeds marginal cost, or even that marginal revenue exceeds marginal cost, and regard that as a case of excess supply of automobiles. Now you could ask, why do not prices and wages erode and crumble under those circumstances? Why doesn’t the unemployed worker who told me “Yes, I would like to work, at the going wage, at the old job that my brother-in-law or my brother-in-law’s brother-in-law is still holding”, why doesn’t that person offer to work at that job for less? Indeed why doesn’t the employer try to encourage wage reduction? That doesn’t happen either. Why does the Chrysler Corporation not cut the price? Those are questions that I think an adult person might spend a lifetime studying. They are important and serious questions, but the notion that the excess supply is not there strikes me as utterly implausible.

Robert Solow

No unnecessary beating around the bush here.

The always eminently quotable Solow says it all.

The purported strength of New Classical macroeconomics is that it has firm anchorage in preference-based microeconomics, and especially the decisions taken by intertemporal utility-maximizing ‘forward-looking’ individuals.

To some of us, however, this has come at too high a price. The almost quasi-religious insistence that macroeconomics has to have microfoundations – without ever presenting either ontological or epistemological justifications for this claim – has turned a blind eye to the weakness of the whole enterprise of trying to depict a complex economy on the basis of an all-embracing representative actor equipped with superhuman knowledge, forecasting abilities and forward-looking rational expectations. It is as if – after having swallowed the sour grapes of the Sonnenschein-Mantel-Debreu theorem – these economists want to resurrect the omniscient Walrasian auctioneer in the form of all-knowing representative actors equipped with rational expectations and assumed somehow to know the true structure of our model of the world.

That anyone should take that kind of stuff seriously is totally and unbelievably ridiculous. Or as Solow has it:

Suppose someone sits down where you are sitting right now and announces to me that he is Napoleon Bonaparte. The last thing I want to do with him is to get involved in a technical discussion of cavalry tactics at the battle of Austerlitz. If I do that, I’m getting tacitly drawn into the game that he is Napoleon. Now, Bob Lucas and Tom Sargent like nothing better than to get drawn into technical discussions, because then you have tacitly gone along with their fundamental assumptions; your attention is attracted away from the basic weakness of the whole story. Since I find that fundamental framework ludicrous, I respond by treating it as ludicrous – that is, by laughing at it – so as not to fall into the trap of taking it seriously and passing on to matters of technique.

Robert Solow

Hierarchical models and clustered residuals (student stuff)

10 February, 2018 at 16:05 | Posted in Statistics & Econometrics | Comments Off on Hierarchical models and clustered residuals (student stuff)


Exaggerated and unjustified statistical claims

9 February, 2018 at 23:07 | Posted in Statistics & Econometrics | Comments Off on Exaggerated and unjustified statistical claims

