Non-ergodicity and time irreversibility (wonkish)

30 June, 2013 at 19:45 | Posted in Economics, Statistics & Econometrics | 1 Comment

As yours truly has argued – e.g. here, here and here – time irreversibility and non-ergodicity are extremely important issues for understanding the deep fundamental flaws of mainstream neoclassical economics in general.

Ole Peters's presentation at Gresham College gives further evidence of why expectation values are irrelevant for understanding economic systems in particular.
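
Peters's central example is a repeated multiplicative gamble: the ensemble average (the expectation value) grows every round, while the time average of almost every individual trajectory shrinks. Here is a minimal simulation sketch of that wedge; the 50/50 gamble of multiplying wealth by 1.5 or 0.6 is of the kind Peters uses in his expositions, but treat the exact numbers as illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Multiplicative gamble: each round wealth is multiplied by 1.5 or 0.6
# with equal probability (illustrative parameters).
up, down, prob_up = 1.5, 0.6, 0.5
n_agents, n_rounds = 10_000, 1_000

factors = rng.choice([up, down], size=(n_agents, n_rounds), p=[prob_up, 1 - prob_up])
final_log_wealth = np.log(factors).sum(axis=1)   # log of terminal wealth, starting from 1

ensemble_growth = prob_up * up + (1 - prob_up) * down                          # 1.05 per round
time_avg_growth = np.exp(prob_up * np.log(up) + (1 - prob_up) * np.log(down))  # ~0.95 per round

print(f"ensemble (expectation) growth factor per round: {ensemble_growth:.3f}")
print(f"time-average growth factor per round:           {time_avg_growth:.3f}")
print("share of agents ending up below their initial wealth:",
      np.mean(final_log_wealth < 0))
```

The expectation value grows by five per cent a round, yet after a thousand rounds virtually every simulated agent is poorer than at the start. In a non-ergodic setting like this, the ensemble average simply does not describe what happens to anyone over time.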

Gesell and Keynes on the determinants of the rate of interest

30 June, 2013 at 12:12 | Posted in Economics | Leave a comment

In Die natürliche Wirtschaftsordnung (1916), Silvio Gesell points to money's capacity to lower the costs of exchanging goods (what in modern terms would be called its ability to reduce transaction costs). At the same time, money, unlike goods, is easy to store, since it does not “rust”. It therefore serves not only as a medium of exchange but also as a store of value.

The usefulness and convenience of money create a demand for having it at one's disposal, and this is why interest exists. Gesell argues that the gross rate of interest contains a risk premium for the lender, as well as a “hausse premium” meant to compensate for the speculative gain (from buying and later selling durable goods) that the lender believes he forgoes by lending out his money instead.

Removing these two mark-ups leaves what Gesell calls “basic interest”. It corresponds in essence to what is otherwise called pure or net interest. Gesell claims historical support for the view that basic interest has remained roughly the same since the emergence of money, namely 3-5 per cent per year, since it rests on the inherent advantages money confers on its holder.

Interest – which, as in John Maynard Keynes, is seen chiefly as a payment needed to counteract the tendency to hoard money in the mattress – is not Gesell's primary target. It is instead regarded as a symptom of what Gesell sees as the root problem: money's function as capital. Interest is nonetheless a problem because it channels income flows to those who hold money. This is considered unjust, and it also prevents any proper matching of production and consumption on the goods market. People who live on interest income hold excess liquidity (a surplus of money) that is not directly spent on goods. This makes demand for the stocks of goods uncertain and thereby leads to crises with falling prices and unemployment.
Read more …

Envoi

29 June, 2013 at 10:46 | Posted in Varia | Leave a comment

Like phosphorescence a star glitters, goes out and lights
and goes out and lights again. The quivering depths carry it.
So I have stood at a hundred Land's Ends
and wondered what I want and what I am to do in the world.

The one thing would be: to be as one is.
The other is: to strain against the point.
And had I held that sharp point less dear
I would, more than gladly, have been like the others.

Some people's essence is: to be. Others': to be without.
Roads have no goal. It is paths that lead there.
Did you see a window shining? Did you think of knocking on the pane?
Yours is a moonlit road, winding white in the swell.

(On YouTube you can revisit Tone Bengtsson's wonderfully fine 1995 TV portrait of Gunnar Ekelöf: http://youtu.be/NwEJCvniooA
h/t Jan Milch)

Read my lips – significance testing is no substitute for doing real science!

28 June, 2013 at 12:26 | Posted in Statistics & Econometrics | Leave a comment

Jager and Leek may well be correct in their larger point, that the medical literature is broadly correct. But I don’t think the statistical framework they are using is appropriate for the questions they are asking. My biggest problem is the identification of scientific hypotheses and statistical “hypotheses” of the “theta = 0” variety.

Based on the word “empirical” in the title, I thought the authors were going to look at a large number of papers with p-values and then follow up and see if the claims were replicated. But no, they don’t follow up on the studies at all! What they seem to be doing is collecting a set of published p-values and then fitting a mixture model to this distribution, a mixture of a uniform distribution (for null effects) and a beta distribution (for non-null effects). Since only statistically significant p-values are typically reported, they fit their model restricted to p-values less than 0.05. But this all assumes that the p-values have this stated distribution. You don’t have to be Uri Simonsohn to know that there’s a lot of p-hacking going on. Also, as noted above, the problem isn’t really effects that are exactly zero, the problem is that a lot of effects are lost in the noise and are essentially undetectable given the way they are studied.

Jager and Leek write that their model is commonly used to study hypotheses in genetics and imaging. I could see how this model could make sense in those fields … but I don’t see this model applying to published medical research, for two reasons. First … I don’t think there would be a sharp division between null and non-null effects; and, second, there’s just too much selection going on for me to believe that the conditional distributions of the p-values would be anything like the theoretical distributions suggested by Neyman-Pearson theory.

So, no, I don’t at all believe Jager and Leek when they write, “we are able to empirically estimate the rate of false positives in the medical literature and trends in false positive rates over time.” They’re doing this by basically assuming the model that is being questioned, the textbook model in which effects are pure and in which there is no p-hacking.

Andrew Gelman
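
Gelman's selection worry is easy to reproduce in a toy setting. The sketch below is my own stylised set-up, not Jager and Leek's data or code: half of the tested effects are truly null, borderline results are sometimes massaged to land just under 0.05, and only ‘significant’ p-values get published. The published p-values then bunch just below the threshold in a way the assumed uniform-plus-beta mixture cannot represent, and the true share of false positives among the published findings can be read straight off the simulation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

n_studies = 50_000
prop_null = 0.5      # share of truly null effects (illustrative assumption)
effect = 0.25        # true effect size in the non-null studies (illustrative)
n_obs = 50           # observations per study

is_null = rng.random(n_studies) < prop_null
true_mean = np.where(is_null, 0.0, effect)

# One two-sided z-test per study.
sample_mean = rng.normal(true_mean, 1.0 / np.sqrt(n_obs))
p_values = 2 * stats.norm.sf(np.abs(sample_mean) * np.sqrt(n_obs))

# Crude "p-hacking": half of the results just above 0.05 are massaged
# until they land just below it.
hacked = (p_values > 0.05) & (p_values < 0.10) & (rng.random(n_studies) < 0.5)
p_values = np.where(hacked, rng.uniform(0.040, 0.049, n_studies), p_values)

published = p_values < 0.05   # only "significant" results get reported

print("true share of false positives among published results:",
      round(np.mean(is_null[published]), 3))
print("share of published p-values crowded into (0.04, 0.05):",
      round(np.mean(p_values[published] > 0.04), 3))
# Under the assumed uniform-plus-beta mixture the published p-values should
# thin out towards 0.05; with selection and massaging they pile up just below it.
```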

Indeed. If anything, this underlines how important it is not to equate science with statistical calculation. All science entails human judgement, and using statistical models doesn’t relieve us of that necessity. When we work with misspecified models, the scientific value of significance testing is actually zero – even though we are making formally valid statistical inferences! Statistical models and concomitant significance tests are no substitute for doing real science. Or as a noted German philosopher once famously wrote:

There is no royal road to science, and only those who do not dread the fatiguing climb of its steep paths have a chance of gaining its luminous summits.


The righteous

28 June, 2013 at 11:06 | Posted in Varia | 1 Comment

Here they stand now, pompous, dignified,
while the market fiddles rasp and grate –
they are the ten righteous men
once sought for in Gomorrah.

Not at all so easy to get the better of;
they lift themselves up with their words.
And had they anything to stand on
they would lift the whole earth.

Wondrously sound in their brains;
devourers of men.
Talking about the sun and the stars
to tillers of the soil.

Then they bring their own safely onto dry land,
pompous, dignified –
pompous ten righteous men
who petrified God and Gomorrah.
 
Nils Ferlin: Tio rättfärdiga

Economics and probability

27 June, 2013 at 13:52 | Posted in Economics, Statistics & Econometrics | 1 Comment

Modern neoclassical economics relies to a large degree on the notion of probability.

To be amenable to applied economic analysis at all, economic observations allegedly have to be conceived of as random events that are analyzable within a probabilistic framework.

But is it really necessary to model the economic system as one in which randomness can only be analyzed and understood on the basis of an a priori notion of probability?

When attempting to convince us of the necessity of founding empirical economic analysis on probability models, neoclassical economics actually forces us to (implicitly) interpret events as random variables generated by an underlying probability density function.

This is at odds with reality. Randomness obviously is a fact of the real world. Probability, on the other hand, attaches (if at all) to the world via intellectually constructed models, and a fortiori is only a fact of a probability-generating (nomological) machine, a well-constructed experimental arrangement, or a “chance set-up”.

Just as there is no such thing as a “free lunch,” there is no such thing as a “free probability.” To be able to talk about probabilities at all, you have to specify a model. If there is no chance set-up or model that generates the probabilistic outcomes or events – in statistics any process in which you observe or measure is referred to as an experiment (rolling a die), and the results obtained are the outcomes or events of the experiment (the number of points rolled with the die being, e.g., 3 or 5) – then, strictly speaking, there is no event at all.
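
The contrast is easiest to see where a genuine chance set-up exists. For a die, the experiment, the event space and the probability measure are all fully specified, so a probability statement can be checked against the very machine that generates the outcomes. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(42)

# A fully specified chance set-up: the experiment (rolling a fair die),
# its event space {1, ..., 6} and its probability measure (1/6 each).
rolls = rng.integers(1, 7, size=100_000)

# The model's probability of the event "a 3 or a 5 is rolled" ...
p_model = 2 / 6
# ... can be checked against the frequencies the set-up actually generates.
p_observed = np.mean(np.isin(rolls, [3, 5]))

print(f"model probability:  {p_model:.4f}")
print(f"observed frequency: {p_observed:.4f}")
# For prices, GDP or income distributions there is no comparable machine
# whose event space and measure we can write down and then check like this.
```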

Probability is a relational element. It must always come with a specification of the model from which it is calculated. And to be of any empirical scientific value, it then has to be shown to coincide with (or at least converge to) real data-generating processes or structures – something seldom or never done!

And this is the basic problem with economic data. If you have a fair roulette wheel, you can arguably specify probabilities and probability density distributions. But how do you conceive of analogous nomological machines for prices, gross domestic product, income distribution, etc.? Only by a leap of faith. And that does not suffice. You have to come up with some really good arguments if you want to persuade people to believe in the existence of socio-economic structures that generate data with characteristics conceivable as stochastic events portrayed by probability distributions!

From a realistic point of view we really have to admit that the socio-economic states of nature that we talk of in most social sciences – and certainly in economics – are not amenable to analysis in terms of probabilities, simply because in the real-world open systems that the social sciences – including economics – analyze, there are no probabilities to be had!

The processes that generate socio-economic data in the real world cannot just be assumed to always be adequately captured by a probability measure. And, so, it cannot really be maintained that it even should be mandatory to treat observations and data – whether cross-section, time series or panel data – as events generated by some probability model. The important activities of most economic agents do not usually include throwing dice or spinning roulette-wheels. Data generating processes – at least outside of nomological machines like dice and roulette-wheels – are not self-evidently best modeled with probability measures.
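
One way to see what is at stake is to treat data from a shifting, open system as if it came from a stable chance set-up. In the toy sketch below (my own illustration, not a claim about any particular data set), a model estimated on the pre-break sample declares an outcome practically impossible that becomes a routine occurrence once the structure has changed:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# A toy "socio-economic" series whose data-generating process itself changes
# halfway through (a structural break), as open systems are prone to do.
pre_break  = rng.normal(loc=0.0, scale=1.0, size=500)
post_break = rng.normal(loc=0.0, scale=3.0, size=500)   # volatility triples

# An econometrician who treats the pre-break sample as draws from a stable
# probability model estimates its parameters ...
mu_hat, sigma_hat = pre_break.mean(), pre_break.std(ddof=1)

# ... and would call a move beyond four estimated standard deviations a
# practically impossible event.
threshold = mu_hat + 4 * sigma_hat
p_model = stats.norm.sf(threshold, loc=mu_hat, scale=sigma_hat)
freq_after_break = np.mean(post_break > threshold)

print(f"model probability of exceeding the threshold: {p_model:.1e}")
print(f"observed post-break frequency:                {freq_after_break:.3f}")
```

The probabilities are impeccable relative to the model; they just say nothing about the world once the data-generating process has moved on.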

If we agree on this, we also have to admit that much of modern neoclassical economics lacks a sound justification. I would even go further and argue that there really is no justifiable rationale at all for this belief that all economically relevant data can be adequately captured by a probability measure. In most real world contexts one has to argue and justify one’s case. And that is obviously something seldom or never done by practitioners of neoclassical economics.

As David Salsburg (2001:146) notes on probability theory:

[W]e assume there is an abstract space of elementary things called ‘events’ … If a measure on the abstract space of events fulfills certain axioms, then it is a probability. To use probability in real life, we have to identify this space of events and do so with sufficient specificity to allow us to actually calculate probability measurements on that space … Unless we can identify [this] abstract space, the probability statements that emerge from statistical analyses will have many different and sometimes contrary meanings.

Like, e.g., John Maynard Keynes (1921) and Nicholas Georgescu-Roegen (1971), Salsburg (2001:301f) is very critical of the way social scientists – including economists and econometricians – have come, uncritically and without argument, to simply assume that one can apply probability distributions from statistical theory to their own areas of research:

Probability is a measure of sets in an abstract space of events. All the mathematical properties of probability can be derived from this definition. When we wish to apply probability to real life, we need to identify that abstract space of events for the particular problem at hand … It is not well established when statistical methods are used for observational studies … If we cannot identify the space of events that generate the probabilities being calculated, then one model is no more valid than another … As statistical models are used more and more for observational studies to assist in social decisions by government and advocacy groups, this fundamental failure to be able to derive probabilities without ambiguity will cast doubt on the usefulness of these methods.

Or as the great British mathematician John Edensor Littlewood says in his A Mathematician’s Miscellany:

Mathematics (by which I shall mean pure mathematics) has no grip on the real world; if probability is to deal with the real world it must contain elements outside mathematics; the meaning of ‘probability’ must relate to the real world, and there must be one or more ‘primitive’ propositions about the real world, from which we can then proceed deductively (i.e. mathematically). We will suppose (as we may by lumping several primitive propositions together) that there is just one primitive proposition, the ‘probability axiom’, and we will call it A for short. Although it has got to be true, A is by the nature of the case incapable of deductive proof, for the sufficient reason that it is about the real world …

We will begin with the … school which I will call philosophical. This attacks directly the ‘real’ probability problem; what are the axiom A and the meaning of ‘probability’ to be, and how can we justify A? It will be instructive to consider the attempt called the ‘frequency theory’. It is natural to believe that if (with the natural reservations) an act like throwing a die is repeated n times the proportion of 6’s will, with certainty, tend to a limit, p say, as n goes to infinity … If we take this proposition as ‘A’ we can at least settle off-hand the other problem, of the meaning of probability; we define its measure for the event in question to be the number p. But for the rest this A takes us nowhere. Suppose we throw 1000 times and wish to know what to expect. Is 1000 large enough for the convergence to have got under way, and how far? A does not say. We have, then, to add to it something about the rate of convergence. Now an A cannot assert a certainty about a particular number n of throws, such as ‘the proportion of 6’s will certainly be within p ± e for large enough n (the largeness depending on e)’. It can only say ‘the proportion will lie between p ± e with at least such and such probability (depending on e and n*) whenever n > n*’. The vicious circle is apparent. We have not merely failed to justify a workable A; we have failed even to state one which would work if its truth were granted. It is generally agreed that the frequency theory won’t work. But whatever the theory it is clear that the vicious circle is very deep-seated: certainty being impossible, whatever A is made to state can only be in terms of ‘probability’.
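
Littlewood's circle is easy to make concrete. Whatever we want to say about the proportion of 6's after a finite number of throws can itself only be said with a probability, and even at n = 1000 a non-negligible share of runs still lies well away from 1/6. A small simulation of my own (not Littlewood's) to illustrate:

```python
import numpy as np

rng = np.random.default_rng(6)

n_runs, n_throws = 10_000, 1_000
# Proportion of 6's in each of 10,000 independent runs of 1,000 throws.
sixes = rng.integers(1, 7, size=(n_runs, n_throws)) == 6
proportions = sixes.mean(axis=1)

p, eps = 1 / 6, 0.02
outside = np.mean(np.abs(proportions - p) > eps)

# The frequency "axiom" only lets us say that the proportion lies within
# p ± eps *with some probability*, which is exactly Littlewood's circularity.
print(f"share of runs with a proportion outside {p:.3f} ± {eps}: {outside:.3f}")
```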

This importantly also means that if you cannot show that data satisfies all the conditions of the probabilistic nomological machine, then the statistical inferences used – and a fortiori neoclassical economics – lack sound foundations!

 
References

Georgescu-Roegen, Nicholas (1971), The Entropy Law and the Economic Process. Cambridge, MA: Harvard University Press.

Keynes, John Maynard (1973 [1921]), A Treatise on Probability. Volume VIII of The Collected Writings of John Maynard Keynes. London: Macmillan.

Littlewood, John Edensor (1953), A Mathematician’s Miscellany. London: Methuen & Co.

Salsburg, David (2001), The Lady Tasting Tea. New York: Henry Holt.

Listen to Larry, Greg!

25 June, 2013 at 10:43 | Posted in Politics & Society | Leave a comment


Lawrence Summers listening to Greg Mankiw’s explications on the one percent?

Even though the interest may not be reciprocated, it would obviously be a good idea for Greg Mankiw to listen to his Harvard colleague Lawrence Summers, instead of trivializing the problems created by increasing inequality! Summers has some interesting thoughts on why income inequality is on the rise and what to do about it:

Why has the top 1 per cent of the population done so well relative to the rest? The answer probably lies substantially in changing technology and globalisation. When George Eastman revolutionised photography, he did very well and, because he needed a large number of Americans to carry out his vision, the city of Rochester had a thriving middle class for two generations. By contrast, when Steve Jobs revolutionised personal computing, he and the shareholders in Apple (who are spread all over the world) did very well but a much smaller benefit flowed to middle-class American workers both because production was outsourced and because the production of computers and software was not terribly labour intensive …

What then is the right response to rising inequality? There are too few good ideas in current political discourse and the development of better ones is crucial. Here are three.

First, government must be careful that it does not facilitate increases in inequality by rewarding the wealthy with special concessions. Where governments dispose of assets or allocate licences, there is a compelling case for more use of auctions to which all have access. Where government provides insurance implicitly or explicitly, premiums must be set as much as possible on a market basis rather than in consultation with the affected industry. A general posture for government of standing up for capitalism rather than particular well-connected capitalists would also serve to mitigate inequality.

Second, there is scope for pro-fairness, pro-growth tax reform. When there are more and more great fortunes being created and the government is in larger and larger deficit, it is hardly a time for the estate tax to be eviscerated. With smaller families and ever more bifurcation in the investment opportunities open to those with wealth, there is a real risk that the old notion of “shirtsleeves to shirtsleeves in three generations” will become obsolete, and those with wealth will endow dynasties.

Third, the public sector must insure that there is greater equity in areas of the most fundamental importance. It will always be the case in a market economy that some will have mansions, art and the ability to travel in lavish fashion. What is more troubling is that the ability of the children of middle-class families to attend college has been seriously compromised by increasing tuition fees and sharp cutbacks at public universities and colleges.

At the same time, in many parts of the country a gap has opened between the quality of the private school education offered to the children of the rich and the public school educations enjoyed by everyone else. Most alarming is the near doubling over the last generation in the gap between the life expectancy of the affluent and the ordinary.

Neither the politics of polarisation nor those of noblesse oblige will serve to protect the interests of the middle class in the post-industrial economy. We will have to find ways to do better.

On the poverty of econometric assumptions (wonkish)

24 June, 2013 at 21:39 | Posted in Statistics & Econometrics | Leave a comment

[T]he authors take as their text a principle of Haavelmo that every testable economic theory should provide a precise formulation of the joint probability distribution of all observable variables to which it refers. It can be argued, however, that Haavelmo’s principle is sounder than the program for realizing it worked out in this book. For, as noted above, what we are asked to assume is that the precept can be carried out in economics by techniques which are established for linear systems, serially independent disturbances, error-free observations, and samples of a size not generally obtainable in economic time series today. In view of such limitations, anyone using these techniques must find himself appealing at every stage less to what theory is saying to him than to what solvability requirements demand of him. Certain it is that the empirical work of this school yields numerous instances in which open questions of economics are resolved in a way that saves a mathematical theorem.
Still, there are doubtless many who will be prepared to make the assumptions required by this theory on pragmatic grounds. We cannot know in advance how well or badly they will work, and they commend themselves on the practical test of convenience. Moreover, as the authors point out, a great many models are compatible with what we know in economics – that is to say, do not violate any matters on which economists are agreed. Attractive as this view is, it fails to draw a necessary distinction between what is assumed and what is merely proposed as hypothesis. This distinction is forced upon us by an obvious but neglected fact of statistical theory: the matters “assumed” are put wholly beyond test, and the entire edifice of conclusions (e.g., about identifiability, optimum properties of the estimates, their sampling distributions, etc.) depends absolutely on the validity of these assumptions. The great merit of modern statistical inference is that it makes exact and efficient use of what we know about reality to forge new tools of discovery, but it teaches us painfully little about the efficacy of these tools when their basis of assumptions is not satisfied. It may be that the approximations involved in the present theory are tolerable ones; only repeated attempts to use them can decide that issue. Evidence exists that trials in this empirical spirit are finding a place in the work of the econometric school, and one may look forward to substantial changes in the methodological presumptions that have dominated this field until now.

Millard Hastay
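
Hastay's point that the whole edifice of conclusions “depends absolutely on the validity of these assumptions” is easily illustrated. In the stylised sketch below (my own example, not taken from the review), the only departure from the textbook set-up is that both the regressor and the disturbances are serially correlated, and the conventional OLS standard error already understates the actual sampling variability of the estimate by a wide margin:

```python
import numpy as np

rng = np.random.default_rng(3)

def ar1(n, rho, rng):
    """AR(1) series with unit-variance innovations."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.normal()
    return x

n, rho, beta = 100, 0.9, 1.0
n_sims = 2_000
estimates, reported_se = [], []

for _ in range(n_sims):
    x = ar1(n, rho, rng)                  # serially correlated regressor
    u = ar1(n, rho, rng)                  # serially correlated disturbance
    y = beta * x + u
    xx = np.sum(x * x)
    b_hat = np.sum(x * y) / xx            # OLS slope (no intercept, for brevity)
    resid = y - b_hat * x
    s2 = np.sum(resid ** 2) / (n - 1)
    estimates.append(b_hat)
    reported_se.append(np.sqrt(s2 / xx))  # textbook SE, valid only for i.i.d. errors

print(f"average textbook OLS standard error: {np.mean(reported_se):.3f}")
print(f"actual spread of the OLS estimates:  {np.std(estimates):.3f}")
# The actual spread is a multiple of the reported standard error: the
# inference machinery is silent about its own behaviour once the
# independence assumption it rests on is violated.
```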

Evenstar

21 June, 2013 at 14:56 | Posted in Varia | Leave a comment

 

Neoclassical economics – emperor without clothes

20 June, 2013 at 18:54 | Posted in Economics | 1 Comment

Almost a century and a half after Léon Walras founded neoclassical general equilibrium theory, economists still have not been able to show that markets move economies to equilibria.

We do know that – under very restrictive assumptions – equilibria do exist, are unique and are Pareto-efficient. After reading Franklin M. Fisher‘s masterly article The stability of general equilibrium – what do we know and why is it important? one, however, has to ask oneself – what good does that do?

As long as we cannot show, except under exceedingly special assumptions, that there are convincing reasons to suppose there are forces which lead economies to equilibria, the value of general equilibrium theory is nil. As long as we cannot really demonstrate that there are forces operating – under reasonable, relevant and at least mildly realistic conditions – that move markets towards equilibria, there cannot really be any sustainable reason for anyone to pay any interest or attention to this theory.

A stability that can only be proved by assuming “Santa Claus” conditions is of no avail. Most people do not believe in Santa Claus anymore. And for good reasons. Santa Claus is for kids, and general equilibrium economists ought to grow up, leaving their Santa Claus economics in the dustbin of history.
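
Nor is instability a mere curiosity requiring exotic constructions. In Scarf's classic three-good exchange economy, where each of three consumers has Leontief preferences over two of the goods and is endowed with one unit of ‘his’ good, the standard tâtonnement price adjustment never reaches the equilibrium but keeps circling it. A discretised sketch of that dynamic (step size and starting prices are illustrative choices):

```python
import numpy as np

def excess_demand(p):
    """Scarf's three-good exchange economy: consumer i has Leontief
    preferences over goods i and i+1 and owns one unit of good i."""
    z = np.empty(3)
    for j in range(3):
        nxt, prv = (j + 1) % 3, (j - 1) % 3
        z[j] = p[j] / (p[j] + p[nxt]) + p[prv] / (p[prv] + p[j]) - 1.0
    return z

# Tatonnement: raise prices of goods in excess demand, lower the rest.
p = np.array([0.6, 1.2, 1.2])        # start away from the equilibrium (1, 1, 1)
step = 0.005                         # illustrative adjustment speed

for t in range(100_001):
    if t % 25_000 == 0:
        share = p / p.sum()
        dist = np.linalg.norm(share - 1 / 3)
        print(f"t = {t:6d}   price shares = {np.round(share, 3)}   "
              f"distance from equilibrium = {dist:.3f}")
    p += step * excess_demand(p)

# The distance never falls towards zero: the (discretised) adjustment process
# circles the equilibrium, drifting slowly outward, instead of finding it.
```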

Continuing to model a world full of agents who behave as economists do – “often wrong, but never uncertain” – while still not being able to show that the system converges to equilibrium under reasonable assumptions (or simply assuming the problem away) is a gross misallocation of intellectual resources and time.

In case you think, even for a moment, that drawing this dismal conclusion is just an idiosyncrasy of yours truly and other heterodox economists, you had better think twice! Here is what one of the world’s greatest microeconomists – Alan Kirman – writes in his thought-provoking paper The intrinsic limits of modern economic theory:

If one maintains the fundamentally individualistic approach to constructing economic models no amount of attention to the walls will prevent the citadel from becoming empty. Empty in the sense that one cannot expect it to house the elements of a scientific theory, one capable of producing empirically falsifiable propositions …
Starting from ‘badly behaved’ individuals, we arrive at a situation in which not only is aggregate demand a nice function but, by a result of Debreu, equilibrium will be ‘locally unique’. Whilst this means that at least there is some hope for local stability, the real question is, can we hope to proceed and obtain global uniqueness and stability?

The unfortunate answer is a categorical no! [The results of Sonnenschein (1972), Debreu (1974), Mantel (1976) and Mas-Colell (1985)] show clearly why any hope for uniqueness or stability must be unfounded … There is no hope that making the distribution of preferences or income ‘not too dispersed’ or ‘single peaked’ will help us to avoid the fundamental problem …

The problem seems to be embodied in what is an essential feature of a centuries-long tradition in economics, that of treating individuals as acting independently of each other …

To argue in this way suggests … that once the appropriate signals are given, individuals behave in isolation and the result of their behaviour may simply be added together …

The idea that we should start at the level of the isolated individual is one which we may well have to abandon … we should be honest from the outset and assert simply that by assumption we postulate that each sector of the economy behaves as one individual and not claim any spurious microjustification …

Economists therefore should not continue to make strong assertions about this behaviour based on so-called general equilibrium models which are, in reality, no more than special examples with no basis in economic theory as it stands.

From a macroeconomic point of view, the arguments of Fisher and Kirman also show why New Classical, Real Business Cycles, Dynamic Stochastic General Equilibrium (DSGE) and “New Keynesian” microfounded macromodels are such bad substitutes for real macroeconomic analysis.

These models try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality. And – worse still – with something that is not even amenable to the kind of general equilibrium analysis it is supposed to give a foundation for, since the Sonnenschein-Mantel-Debreu theorem unequivocally showed that the standard assumptions on individuals do not suffice to guarantee either stability or uniqueness of the equilibrium solution.

Opting for cloned representative agents that are all identical is of course not a real solution to the fallacy of composition that the theorem points to. After all – as Nobel laureate Robert Solow noted in “The State of Macroeconomics” (Journal of Economic Perspectives 2008:243-249) – “a modern economy is populated by consumers, workers, pensioners, owners, managers, investors, entrepreneurs, bankers, and others, with different and sometimes conflicting desires, information, expectations, capacities, beliefs, and rules of behavior.” So, representative agent models are rather an evasion whereby issues of distribution, coordination, heterogeneity – everything that really defines macroeconomics – are swept under the rug.
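
Solow's list of heterogeneous actors is not mere rhetoric; it is where the aggregation problem bites. Even in the simplest textbook setting, two Cobb-Douglas consumers with different expenditure shares generate an aggregate demand that depends on how income is distributed between them, so no ‘representative’ consumer responding to aggregate income alone can reproduce it. A small sketch (parameter values are of course just illustrative):

```python
import numpy as np

def agg_demand_good1(incomes, shares, p1=1.0):
    """Aggregate Cobb-Douglas demand for good 1: each consumer spends a
    fixed share of his own income on it."""
    return np.sum(np.array(shares) * np.array(incomes)) / p1

shares = [0.8, 0.2]            # expenditure shares on good 1 (illustrative)
total_income = 100.0

# The same aggregate income, distributed in two different ways.
equal_split  = [50.0, 50.0]
skewed_split = [90.0, 10.0]

print("aggregate income:", total_income, "in both cases")
print("demand for good 1, equal split: ", agg_demand_good1(equal_split, shares))
print("demand for good 1, skewed split:", agg_demand_good1(skewed_split, shares))
# 50.0 versus 74.0: aggregate demand is not a function of aggregate income
# alone, so no representative agent facing only the aggregate can mimic it.
```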

Conclusion – don’t believe a single thing of what these microfounders tell you until they have told you how they have coped with – not evaded – Sonnenschein-Mantel-Debreu!

Of course, most neoclassical macroeconomists know that using a representative agent is a flagrantly illegitimate way of ignoring real aggregation issues. They keep on with their business nevertheless, just because it significantly simplifies what they are doing. It is reminiscent – and not just a little – of the drunkard who has lost his keys in some dark place and deliberately chooses to look for them under a neighbouring street light just because it is easier to see there!

