Mainstream economics and the perils of simplicity

10 Oct, 2015 at 17:48 | Posted in Economics | Comments Off on Mainstream economics and the perils of simplicity

Mainstream — neoclassical — economics has become increasingly irrelevant to the understanding of the real world. The main reason for this irrelevance is the failure of economists to match their deductive-axiomatic methods with their subject.

It is — sad to say — a fact that within mainstream economics internal validity is everything and external validity nothing. Why anyone should be interested in those kinds of theories and models — as long as mainstream economists do not come up with any export licenses for them to the real world in which we live — is beyond my imagination. Sure, the simplicity that axiomatics and analytical arguments bring to economics is attractive to many economists. But …

Simplicity, however, has its perils. It is one thing to choose as one’s first object of theoretical study the type of arguments open to analysis in the simplest terms. But it is quite another to treat this type of argument as a paradigm and to demand that arguments in other fields should conform to its standards regardless, or to build up from a study of the simplest forms of argument alone a set of categories intended for application to arguments of all sorts: one must at any rate begin by inquiring carefully how far the artificial simplicity of one’s chosen model results in these logical categories also being artificially simple. The sorts of risks one runs otherwise are obvious enough. Distinctions which all happen to cut along the same line for the simplest arguments may need to be handled quite separately in the general case; if we forget this, and our new-found logical categories yield paradoxical results when applied to more complex arguments, we may be tempted to put these results down to defects in the arguments instead of in our categories; and we may end up by thinking that, for some regrettable reason hidden deep in the nature of things, only our original, peculiarly simple arguments are capable of attaining to the ideal of validity.
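
Stephen Toulmin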

Statistical significance testing — pseudo-intellectual garbage

10 Oct, 2015 at 09:56 | Posted in Statistics & Econometrics | Comments Off on Statistical significance testing — pseudo-intellectual garbage

Decisions based on statistical significance testing certainly make life easier. But significance testing doesn’t give us the knowledge we want. It only answers a question we as researchers never ask — what is the probability of getting the result we have got, assuming that there is no difference between two sets of data (e.g. control group vs. experimental group, or sample vs. population)? On the question we really are interested in — how probable and reliable is our hypothesis? — it remains silent.
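
To see concretely what the test does and does not deliver, here is a minimal sketch (Python, with invented data; the group means and sizes are made up purely for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(10.0, 2.0, size=50)     # invented control-group data
treatment = rng.normal(10.5, 2.0, size=50)   # invented experimental-group data

t, p = stats.ttest_ind(treatment, control)
print(f"p = {p:.3f}")
# p is Pr(a result at least this extreme | no true difference).
# It is NOT Pr(our hypothesis is true | the result) -- the question
# we actually care about, and on which the test remains silent.
```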

One wonders whether the function of statistical techniques in the social sciences is not primarily to provide a machinery for producing phoney corroborations and thereby a semblance of ‘scientific progress’ where, in fact, there is nothing but an increase in pseudo-intellectual garbage. …

Imre Lakatos


Economic convergence and the Markov chain approach

9 Oct, 2015 at 09:57 | Posted in Economics, Statistics & Econometrics | 2 Comments

In the Markov chain approach to income convergence … the “law of motion” driving the evolution of the income distribution is assumed memory-less and time invariant. Upon having estimated probabilities of moving up or down the income hierarchy during a transition period of a given length, the law is used to calculate a limiting income distribution characterizing a stochastic steady-state income distribution to which the system converges over time. Although several authors emphasize that the limiting distribution merely represents a thought experiment, this distribution is necessary to clarify the direction of the evolution. The estimated transition probability matrix in itself is often rather uninformative with respect to the evolution of the income distribution. Unlike the convergence-regression approach, however, the reliability of the estimated transition probabilities and hence of the limiting income distribution has rarely been questioned. In most empirical studies, the statistical assumptions underlying the Markov chain approach have been taken for granted, although they are quite restrictive …

As an illustration, the article shows that the evolution of the income distribution across the forty-eight contiguous U.S. states from 1929 to 2000 clearly does not follow a common first-order stationary Markov process, for various reasons. First, there is a structural break in the aftermath of World War II that significantly affects the evolution of the income distribution; another structural break may have occurred in the late 1990s. Second, certain groups of states … show a development that is different from other states … Third, states with poor neighbors show a different development than states with rich neighbors. Moreover, a choice for annual transition periods is shown to be inconsistent with the Markov property. Ignoring these factors may considerably affect the correctness of inferences about the evolution of the regional income distribution drawn from the limiting distribution.

Frank Bickenbach & Eckhardt Bode
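
To see what the “law of motion” machinery amounts to, here is a minimal sketch (Python; the three income classes and the transition probabilities are made up for illustration, not Bickenbach and Bode’s estimates) of how a limiting income distribution is computed from an estimated transition matrix:

```python
import numpy as np

# Hypothetical transition matrix for three income classes
# (low, middle, high): rows = current class, columns = next period.
P = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])

# Iterating the "law of motion" from any starting distribution
# converges to the limiting (stationary) distribution pi = pi @ P --
# *if* the chain really is first-order, time-invariant and common
# to all units.
dist = np.array([1.0, 0.0, 0.0])      # start: everyone in the low class
for _ in range(1000):
    dist = dist @ P
print(dist)

# Equivalently: the left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
print(pi / pi.sum())
```

Every step presupposes that the same first-order, time-invariant chain governs all units in all periods, which is exactly the set of assumptions the quoted study shows to fail for the U.S. states.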

Nobel prize and self-righteous Chicago drivel

8 Oct, 2015 at 21:51 | Posted in Economics | 1 Comment

In 2007 Thomas Sargent gave a graduation speech at the University of California, Berkeley, giving the grads “a short list of valuable lessons that our beautiful subject teaches”:

1. Many things that are desirable are not feasible.
2. Individuals and communities face trade-offs.
3. Other people have more information about their abilities, their efforts, and their preferences than you do.
4. Everyone responds to incentives, including people you want to help. That is why social safety nets don’t always end up working as intended.
5. There are trade-offs between equality and efficiency.
6. In an equilibrium of a game or an economy, people are satisfied with their choices. That is why it is difficult for well-meaning outsiders to change things for better or worse.
7. In the future, you too will respond to incentives. That is why there are some promises that you’d like to make but can’t. No one will believe those promises because they know that later it will not be in your interest to deliver. The lesson here is this: before you make a promise, think about whether you will want to keep it if and when your circumstances change. This is how you earn a reputation.
8. Governments and voters respond to incentives too. That is why governments sometimes default on loans and other promises that they have made.
9. It is feasible for one generation to shift costs to subsequent ones. That is what national government debts and the U.S. social security system do (but not the social security system of Singapore).
10. When a government spends, its citizens eventually pay, either today or tomorrow, either through explicit taxes or implicit ones like inflation.
11. Most people want other people to pay for public goods and government transfers (especially transfers to themselves).
12. Because market prices aggregate traders’ information, it is difficult to forecast stock prices and interest rates and exchange rates.

Reading through this list of “valuable lessons”, things suddenly fall into place.

This kind of self-righteous neoliberal drivel has again and again been praised and prized. And not only by econ bloggers and right-wing think-tanks.

Out of the seventy-five laureates that have been awarded “The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel,” twenty-eight have been affiliated with The University of Chicago — that is 37%. The world is really a small place when it comes to economics …

The housing bubble is about to burst!

6 Oct, 2015 at 11:22 | Posted in Economics | 3 Comments

Yesterday Mats Dillén, director-general of Konjunkturinstitutet (the Swedish National Institute of Economic Research), warned of a palpable risk that housing prices are at an unsustainably high level. The high housing prices cannot be explained by rising incomes, low interest rates and tax breaks. With prices rising at an annual rate of 10-15 per cent, we are clearly heading towards a situation where the housing bubble bursts and a new financial and economic crash becomes a fact.

It is gratifying that Konjunkturinstitutet finally dares to speak plainly about the Swedish housing bubble. A few of us economists have been trying to raise this issue and point out the dangers for a couple of years now. For the most part we have been met with the usual haughty establishment silence. It will presumably be harder to wave away Konjunkturinstitutet’s concerns.

Swedish households believe that housing prices will continue to rise. Every housing-price indicator has shown this consistently for several years now. Despite the financial crisis and the worries over the euro, this will probably mean that Swedes keep borrowing more and more to buy homes – further adding to an already astronomically high volume of debt.

Every level-headed observer realizes that these are problems that must be solved before the housing bubble bursts. Otherwise there is a high risk that the laissez-faire policy will come back to haunt us – and then it is the unemployed, the homeless and the indebted who will take the blows.

Household indebtedness is rooted mainly in the rise in asset values driven by increased lending to households and the housing bubble it has given rise to. In the long run this trend is obviously impossible to sustain. Asset prices fundamentally reflect expectations of future returns on investment. If asset prices keep rising faster than incomes, the effect will be higher inflation with an attendant downward adjustment of the real value of the assets.

The real down payment on house purchases in Sweden is extremely low, which makes the leverage effects all the greater. And the Riksbank’s attempts to “pimp” the economy with lower interest rates have led to even higher housing prices and even greater leverage. And an even bigger housing bubble.

Suppose you have a sum of money (A) that you want to invest. You can put the money in the bank and earn an annual interest rate (r) on it. If, for simplicity, we disregard risk, asset depreciation and transaction costs, rA should equal the income you could alternatively obtain by instead buying a house with a down payment of A and renting it out for a year at a rent of h, plus the change in the house price (dhp), i.e.:

(1) rA = h + dhp

Dividing both sides by the house price (hp) and solving for hp, we get

(2) hp = h/[r(A/hp) – (dhp/hp)]

If you expect house prices (hp) to rise, the price of the house will rise. This kind of self-generating cumulative process is the very core of the housing bubble. Unlike ordinary goods markets, where the demand curve as a rule slopes downward, asset markets often exhibit an upward-sloping demand curve, which gives rise to precisely this kind of instability. Note also that the greater the leverage (the lower A/hp), the greater the price increase.
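
A minimal numerical illustration (Python; all parameter values invented). Reading A/hp as a fixed down-payment ratio a and dhp/hp as an expected appreciation rate g, equation (2) becomes hp = h/(ra − g), and the implied price explodes as expected appreciation approaches ra:

```python
# hp = h / (r*a - g), with a = A/hp (down-payment ratio)
# and g = dhp/hp (expected appreciation). Illustrative numbers only.
h, r = 100.0, 0.05        # annual rent, bank interest rate

def house_price(a, g):
    """Implied house price; blows up as g approaches r*a."""
    return h / (r * a - g)

for g in (0.000, 0.005, 0.009):
    print(f"g = {g:.3f}:  hp = {house_price(a=0.2, g=g):>9.0f}")
# Rising expected appreciation g pushes hp up without bound --
# and a lower down-payment ratio a (higher leverage) does the same.
```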

With a down payment of 20 you can buy a house worth 100. The leverage is 5, and if the price of the house rises by 10% your equity increases by 50% – but if the price falls by 10%, your equity correspondingly shrinks by 50%. Once asset prices start to fall, the high leverage sets off a cumulative process of falling prices, in which increased uncertainty and volatility can quickly lead to demands for more collateral and larger down payments. The housing bubble bursts and the crisis is at the door.
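
The leverage arithmetic in the example above, spelled out:

```python
equity, price = 20.0, 100.0
debt = price - equity                  # 80
leverage = price / equity              # 5

for change in (+0.10, -0.10):          # house price moves +/- 10%
    new_equity = price * (1 + change) - debt
    print(f"price {change:+.0%} -> equity {(new_equity - equity) / equity:+.0%}")
# price +10% -> equity +50%
# price -10% -> equity -50%
```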

What should be done? William McChesney Martin – chairman of the US Federal Reserve in the 1950s and 1960s – is said to have joked that a central bank’s foremost task is to take away the punch bowl just as the party gets going. The party has already gone on far too long. Avoiding a housing crash like the one in the United States eight years ago will require resolute action by the Riksbank and the government. When brokers and banks with vested interests trivialize the housing bubble, they only make the blow worse when it finally comes.

With the debt ratios households have taken on, we risk a debt-deflation crisis that will hit Swedish households extremely hard.

It is deeply worrying that Swedish households are prepared to take on loans as large as they do today. It is high time the almost exponential growth of household debt was slowed. Otherwise the dream of owning one’s own home may very well turn into a nightmare.

In 1637 the price of a single tulip bulb in the Netherlands could be as high as two years’ wages. And households were convinced that prices would just keep rising and rising. Like every other bubble, however, this one – the “tulip mania” – also burst, leaving masses of destitute and ruined people in its wake. Similar things played out in, for example, the Mississippi bubble of 1720 and the IT bubble fifteen years ago. How hard can it be to learn from history?

It is frightening, to say the least, that the housing bubble just keeps growing. When it finally bursts – and burst it will – the crisis will be all the worse.

Touching the void (private)

6 Oct, 2015 at 10:13 | Posted in Varia | Comments Off on Touching the void (private)


Forget Hollywood and Everest.
This is the one mountaineering film you just have to watch.
Absolutely fabulous!

Austrian critique of mainstream economics

5 Oct, 2015 at 22:57 | Posted in Economics | 3 Comments

 

Great video — after listening to Walter Block you will never for a second consider Austrian economics to be a feasible alternative to mainstream economics …

Added October 08: Lord Keynes has an interesting follow-up here.

Redistributive economic policies and the limited applicability of Markov model thinking

5 Oct, 2015 at 15:40 | Posted in Economics | Comments Off on Redistributive economic policies and the limited applicability of Markov model thinking

 

On the dynamics of wealth inequality

5 Oct, 2015 at 11:19 | Posted in Economics | Comments Off on On the dynamics of wealth inequality

Over the last three decades, Atkinson et al. (2011) find that there has been an increase in the concentration of income in many countries while Wolff (2010) describes a similar though smaller increase in the concentration of wealth in the United States.

Motivated by these stylized facts, we develop a model in which infinitely lived households face idiosyncratic investment risk, and we examine the dynamic behavior of the distribution of wealth over time. Our goal is to explore these dynamics in the absence of any redistributive mechanisms, so that the outcome of the model is affected only by households’ optimal decisions about how much to consume or save and their realized labor and investment incomes. Because we assume that all households are equally patient and have identical abilities, it is luck alone—in the form of high realized investment returns—that creates diverging levels of wealth. In this setting, we show that the equilibrium distribution of wealth is not stationary, and, using recent results in mathematical finance and stochastic portfolio theory, we prove that it becomes increasingly right-skewed over time and tends to a limit in which wealth is concentrated entirely at the top …

The main conclusion of our analysis is clear. In the absence of any redistribution, the distribution of wealth is unstable over time and grows increasingly right-skewed until virtually all wealth is concentrated with a single household. This occurs despite the fact that the households in the economy have identical opportunities and identical preferences and abilities. It is important to emphasize that our setup in this paper, in which there is absolutely no redistribution, is intended to describe an important benchmark case rather than to capture the true state of the World … In reality, a number of potentially redistributive mechanisms, such as government tax and fiscal policies and limited intergenerational transfers, constantly affect the economy and influence the extent of concentration of wealth at the top. Indeed, our conclusions highlight the importance of these redistributive mechanisms, since it is their presence alone that ensures the stability of the economy and prevents an outcome in which the distribution of wealth is non-stationary and grows increasingly right-skewed over time.

Ricardo Fernholz & Robert Fernholz
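
For intuition, here is a toy simulation (emphatically not the Fernholz & Fernholz model itself, just a crude illustration with invented parameters) in which identical households differ only in realized investment luck, and the top 1% wealth share nevertheless keeps growing:

```python
import numpy as np

rng = np.random.default_rng(42)
n, periods = 10_000, 200
wealth = np.ones(n)                    # identical households, identical start

top_share = []
for _ in range(periods):
    # idiosyncratic multiplicative investment luck, drawn from the
    # same distribution for everyone (sigma is invented)
    wealth *= rng.lognormal(mean=0.0, sigma=0.2, size=n)
    top = np.sort(wealth)[-(n // 100):]          # richest 1%
    top_share.append(top.sum() / wealth.sum())

print(f"top-1% share: {top_share[0]:.1%} -> {top_share[-1]:.1%}")
```

With purely multiplicative luck, log-wealth follows a random walk, so the cross-sectional spread grows without bound, a crude analogue of the non-stationarity the authors prove. Adding a redistributive mechanism to the loop makes the share settle down instead.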

Given their rather arbitrary distributional assumptions, inequality-dynamics models of this kind can basically never be anything but “thought experiments” — but they are still interesting because they show that even if people were the same in terms of efficiency, skilfulness, capability, etc., pure luck (randomness) may create a very unequal society where almost all wealth is concentrated among those at the top. This of course also provides an extra rationale for building a society with strong redistributive institutions and mechanisms. Without a conscious effort to counteract the inevitable forces driving our societies towards extreme income and wealth inequality, our societies crack. It is crucial to have strong redistributive policies if we want stable economies and societies. Redistributive taxes and active fiscal policies are necessary ingredients for building a good society.

What we see happening in the US, the UK, Sweden and elsewhere is deeply disturbing. The rising inequality is outrageous – not least since it has largely to do with income and wealth increasingly being concentrated in the hands of a very small and privileged elite.

Societies where we allow the inequality of incomes and wealth to increase without bounds sooner or later implode. The cement that keeps us together erodes, and in the end we are left only with people dipped in the ice-cold water of egoism and greed. It’s high time to put an end to this, the worst Juggernaut of our time!

Macroeconomic causality

4 Oct, 2015 at 21:02 | Posted in Economics | Comments Off on Macroeconomic causality

The Greek word “empiric” refers to ancient physicians who based their medical advice on experience, not theory. Medieval empirics came to the conclusion that blood-letting caused improvements in health because the health of the patients often improved after the blood was let. But we know now that temporal orderings do not imply causation, even though we give Nobel prizes [Clive Granger] to folks who use temporal orderings to infer causation. Just to make sure, we call it a fallacy and express it in Latin: Post Hoc Ergo Propter Hoc: After that, because of that.

For scientifically valid causal inferences, we need an experiment; we need a control group and a treated group. Then the difference between the outcome for the treated group and the outcome for the control group is a measure of the effect of the treatment.

In the area of macroeconomics, experiments are hard to come by. What we have are only temporal orderings: first this and then that.

With only temporal orderings and no experimental evidence, we do what humans do: we rely on stories. To each temporal ordering, we attach a predictive narrative or a causal narrative or both. We draw firm causal conclusions from the temporal orderings when the causal narrative is compelling and when there is no equally compelling predictive narrative. This is literature and wisdom, not science …

So there you have it: it’s faith-based decision-making, which is very much influenced by the rhetorical skills of the advocates.
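
Edward Leamer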

Probability and economics (wonkish)

3 Oct, 2015 at 08:59 | Posted in Economics, Statistics & Econometrics | 2 Comments

Modern neoclassical economics relies to a large degree on the notion of probability.

To be at all amenable to applied economic analysis, economic observations allegedly have to be conceived of as random events that are analyzable within a probabilistic framework.

But is it really necessary to model the economic system as a system where randomness can only be analyzed and understood when based on an a priori notion of probability?

When attempting to convince us of the necessity of founding empirical economic analysis on probability models, neoclassical economics actually forces us to (implicitly) interpret events as random variables generated by an underlying probability density function.

This is at odds with reality. Randomness obviously is a fact of the real world. Probability, on the other hand, attaches (if at all) to the world via intellectually constructed models, and a fortiori is only a fact of a probability-generating (nomological) machine, a well-constructed experimental arrangement, or a “chance set-up”.

In probabilistic econometrics, randomness is often defined with the help of independent trials – two events are said to be independent if the occurrence or non-occurrence of either one has no effect on the probability of the occurrence of the other – such as drawing cards from a deck, picking balls from an urn, spinning a roulette wheel or tossing coins – trials which are only definable if somehow set in a probabilistic context.

But if we pick a sequence of prices – say 2, 4, 3, 8, 5, 6, 6 – that we want to use in an econometric regression analysis, how do we know that the sequence is random and, a fortiori, that it can be treated as generated by an underlying probability density function? How can we argue that the sequence is a sequence of probabilistically independent random prices? And are they really random in the sense most often applied in probabilistic econometrics – where X is called a random variable only if there is a sample space S with a probability measure, and X is a real-valued function over the elements of S?
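
One can of course compute diagnostics on such a sequence (the sketch below checks the lag-1 autocorrelation of the toy prices), but note that the diagnostic itself only has meaning inside an already-assumed probability model; it cannot establish that such a model applies:

```python
import numpy as np

prices = np.array([2, 4, 3, 8, 5, 6, 6], dtype=float)

# Lag-1 autocorrelation: a crude check of "independence" that is
# itself only interpretable if we presuppose a stochastic model.
x = prices - prices.mean()
r1 = (x[:-1] * x[1:]).sum() / (x * x).sum()
print(f"lag-1 autocorrelation: {r1:.2f}")
```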

Bypassing the scientific challenge of going from describable randomness to calculable probability by just assuming it is of course not an acceptable procedure. Since a probability density function is a “Gedanken” object that does not exist in a natural sense, it has to come with an export license to our real target system if it is to be considered usable. We still have to show that the real sequence somehow coincides with the ideal sequence that defines independence and randomness within our “nomological machine,” our “probabilistic model.”

Just as there is no such thing as a “free lunch,” there is no such thing as a “free probability.” To be able to talk about probabilities at all, you have to specify a model. If there is no chance set-up or model that generates the probabilistic outcomes or events – in statistics, any process in which you observe or measure something is called an experiment (rolling a die), and the results obtained are called the outcomes or events of the experiment (the number of points rolled with the die, e.g. 3 or 5) – then, strictly speaking, there is no event at all.

Probability is a relational element. It always must come with a specification of the model from which it is calculated. And then to be of any empirical scientific value it has to be shown to coincide with (or at least converge to) real data generating processes or structures – something seldom or never done!

And this is the basic problem with economic data. If you have a fair roulette-wheel, you can arguably specify probabilities and probability density distributions. But how do you conceive of the analogous nomological machines for prices, gross domestic product, income distribution etc? Only by a leap of faith. And that does not suffice. You have to come up with some really good arguments if you want to persuade people into believing in the existence of socio-economic structures that generate data with characteristics conceivable as stochastic events portrayed by probabilistic density distributions!

From a realistic point of view we really have to admit that the socio-economic states of nature that we talk of in most social sciences – and certainly in economics – are not amenable to analysis in terms of probabilities, simply because in the real-world open systems that the social sciences – including economics – analyze, there are no probabilities to be had!

The processes that generate socio-economic data in the real world cannot just be assumed to always be adequately captured by a probability measure. And, so, it cannot really be maintained that it even should be mandatory to treat observations and data – whether cross-section, time series or panel data – as events generated by some probability model. The important activities of most economic agents do not usually include throwing dice or spinning roulette-wheels. Data generating processes – at least outside of nomological machines like dice and roulette-wheels – are not self-evidently best modeled with probability measures.

If we agree on this, we also have to admit that much of modern neoclassical economics lacks a sound justification. I would even go further and argue that there really is no justifiable rationale at all for this belief that all economically relevant data can be adequately captured by a probability measure. In most real world contexts one has to argue and justify one’s case. And that is obviously something seldom or never done by practitioners of neoclassical economics.

As David Salsburg (2001:146) notes on probability theory:

[W]e assume there is an abstract space of elementary things called ‘events’ … If a measure on the abstract space of events fulfills certain axioms, then it is a probability. To use probability in real life, we have to identify this space of events and do so with sufficient specificity to allow us to actually calculate probability measurements on that space … Unless we can identify [this] abstract space, the probability statements that emerge from statistical analyses will have many different and sometimes contrary meanings.

Just as, e.g., John Maynard Keynes (1921) and Nicholas Georgescu-Roegen (1971), Salsburg (2001:301f) is very critical of the way social scientists – including economists and econometricians – have, uncritically and without argument, come to simply assume that one can apply probability distributions from statistical theory to their own area of research:

Probability is a measure of sets in an abstract space of events. All the mathematical properties of probability can be derived from this definition. When we wish to apply probability to real life, we need to identify that abstract space of events for the particular problem at hand … It is not well established when statistical methods are used for observational studies … If we cannot identify the space of events that generate the probabilities being calculated, then one model is no more valid than another … As statistical models are used more and more for observational studies to assist in social decisions by government and advocacy groups, this fundamental failure to be able to derive probabilities without ambiguity will cast doubt on the usefulness of these methods.

Or as the great British mathematician John Edensor Littlewood says in his A Mathematician’s Miscellany:

Mathematics (by which I shall mean pure mathematics) has no grip on the real world; if probability is to deal with the real world it must contain elements outside mathematics; the meaning of ‘probability’ must relate to the real world, and there must be one or more ‘primitive’ propositions about the real world, from which we can then proceed deductively (i.e. mathematically). We will suppose (as we may by lumping several primitive propositions together) that there is just one primitive proposition, the ‘probability axiom’, and we will call it A for short. Although it has got to be true, A is by the nature of the case incapable of deductive proof, for the sufficient reason that it is about the real world …

We will begin with the … school which I will call philosophical. This attacks directly the ‘real’ probability problem; what are the axiom A and the meaning of ‘probability’ to be, and how can we justify A? It will be instructive to consider the attempt called the ‘frequency theory’. It is natural to believe that if (with the natural reservations) an act like throwing a die is repeated n times the proportion of 6’s will, with certainty, tend to a limit, p say, as n goes to infinity … If we take this proposition as ‘A’ we can at least settle off-hand the other problem, of the meaning of probability; we define its measure for the event in question to be the number p. But for the rest this A takes us nowhere. Suppose we throw 1000 times and wish to know what to expect. Is 1000 large enough for the convergence to have got under way, and how far? A does not say. We have, then, to add to it something about the rate of convergence. Now an A cannot assert a certainty about a particular number n of throws, such as ‘the proportion of 6’s will certainly be within p ± e for large enough n (the largeness depending on e)’. It can only say ‘the proportion will lie between p ± e with at least such and such probability (depending on e and n*) whenever n > n*’. The vicious circle is apparent. We have not merely failed to justify a workable A; we have failed even to state one which would work if its truth were granted. It is generally agreed that the frequency theory won’t work. But whatever the theory it is clear that the vicious circle is very deep-seated: certainty being impossible, whatever A is made to state can only be in terms of ‘probability’.
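
Littlewood’s die example is easy to simulate, though, tellingly, the simulation presupposes exactly the probability model whose justification is at issue (a minimal sketch):

```python
import random

random.seed(1)
n, sixes = 1000, 0
for i in range(1, n + 1):
    sixes += (random.randint(1, 6) == 6)
    if i in (10, 100, 1000):
        print(f"n = {i:5d}: proportion of 6's = {sixes / i:.3f}")
# The proportion wanders around 1/6, but any statement about how close
# it is for a given n is itself only a probability statement --
# Littlewood's vicious circle.
```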

This, importantly, also means that if you cannot show that the data satisfy all the conditions of the probabilistic nomological machine, then the statistical inferences used – and a fortiori neoclassical economics – lack sound foundations!

 

References

Georgescu-Roegen, Nicholas (1971), The Entropy Law and the Economic Process. Harvard University Press.

Keynes, John Maynard (1973 (1921)), A Treatise on Probability. Volume VIII of The Collected Writings of John Maynard Keynes, London: Macmillan.

Littlewood, John Edensor (1953), A Mathematician’s Miscellany, London: Methuen & Co.

Salsburg, David (2001), The Lady Tasting Tea. Henry Holt.

Look who’s lecturing who!

2 Oct, 2015 at 18:06 | Posted in Economics | 3 Comments

 

A must-see!

Rethinking economics student vs. mainstream economics professor: 10–0.

What is wrong with economists’ modelling?

2 Oct, 2015 at 09:59 | Posted in Economics | 1 Comment

Why do I suppose that mathematical deductivist modelling of the sort pursued by economists is a problem in itself? … My answer, simply put, can be expressed in the following three propositions:

(i) The sorts of mathematical deductivist methods that economists use are, like all research methods, types of tools.

(ii) All tools are appropriate to dealing with but a limited set of tasks, involving a limited set of phenomena, in a limited set of contexts, and not others.

(iii) The nature and conditions of social reality are such that the forms of mathematical deductivist reasoning favoured by modern economists are almost entirely inadequate as tools of insightful social analysis.

I doubt that many would suggest that we seek to use pencils to cut hedges, telephones to dig gardens, or forks to fly us to other countries. Yet pencils, telephones and forks can be very useful to us in certain contexts, with respect to very specific tasks and phenomena.

Tony Lawson

Modern economics has become increasingly irrelevant to the understanding of the real world. The main reason for this irrelevance is the failure of economists to match their deductive-axiomatic methods with their subject.

It is — sad to say — a fact that within mainstream economics internal validity is everything and external validity nothing. Why anyone should be interested in those kinds of theories and models is beyond my imagination. As long as mainstream economists do not come up with any export licenses for their theories and models to the real world in which we live, they really should not be surprised if people say that this is not science, but autism.

Studying mathematics and logic is interesting and fun. It sharpens the mind. In pure mathematics and logic we do not have to worry about external validity. But economics is not pure mathematics or logic. It’s about society. The real world. Forgetting that, economics is really in danger of becoming — as John Maynard Keynes put it in a letter to Ragnar Frisch in 1935 — “nothing better than a contraption proceeding from premises which are not stated with precision to conclusions which have no clear application.”

Still on top after 50 (private)

1 Oct, 2015 at 21:33 | Posted in Varia | 2 Comments

 

My dear parents took me to the cinema to watch this awesome musical back in 1966.
I was eight years old — and of course I fell hopelessly in love with Maria.
I think I still am …
