Non-ergodicity and time irreversibility (wonkish)
30 Jun, 2013 at 19:45 | Posted in Economics, Statistics & Econometrics | 1 Comment

As yours truly has argued – e.g. here, here and here – time irreversibility and non-ergodicity are extremely important issues for understanding the deep fundamental flaws of mainstream neoclassical economics in general.
Ole Peters' presentation at Gresham College provides further evidence of why expectation values are irrelevant for understanding economic systems in particular.
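Peters' core claim – that for multiplicative dynamics the ensemble average and the time average part company – is easy to check numerically. Here is a minimal Python sketch, assuming a coin-toss gamble of the kind discussed in the lecture (wealth multiplied by 1.5 on heads, by 0.6 on tails; the exact numbers are merely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents, n_rounds = 1_000_000, 50

# A multiplicative gamble: each round a fair coin multiplies wealth
# by 1.5 (heads) or 0.6 (tails).
# Expectation per round: 0.5*1.5 + 0.5*0.6 = 1.05 -> ensemble average grows.
# Time-average growth:   sqrt(1.5*0.6) ~ 0.949   -> almost every single
# trajectory decays. The two disagree because the process is non-ergodic.
wealth = np.ones(n_agents)
for _ in range(n_rounds):
    heads = rng.random(n_agents) < 0.5
    wealth *= np.where(heads, 1.5, 0.6)

print(f"theoretical ensemble mean: {1.05**n_rounds:.2f}")
print(f"sample mean (noisy, driven by a few lucky paths): {wealth.mean():.2f}")
print(f"median agent: {np.median(wealth):.4f}")   # ~ 0.949**50 ~ 0.07
```

The ensemble average grows at five per cent per round, yet the typical (median) agent is all but ruined – expectation values simply do not describe what happens to almost everyone over time.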
Gesell and Keynes on the determinants of the rate of interest
30 Jun, 2013 at 12:12 | Posted in Economics | Comments Off on Gesell and Keynes on the determinants of the rate of interest

In Silvio Gesell's Die natürliche Wirtschaftsordnung (1916) the author points to the capacity of money to lower the costs of exchanging goods (what in modern terms might be called its ability to lower transaction costs). Unlike goods, however, money is also easy to store, since it does not “rust”. It thus functions not only as a medium of exchange but also as a store of value.
The usefulness and flexibility of money create a demand for holding it, which is why interest exists. Gesell argues that the gross rate of interest contains a risk premium for the lender, and also a “hausse premium” meant to compensate for the speculative gains (from buying and later selling durable goods) that the lender believes he forgoes by lending out his money instead.
Strip away these two mark-ups and what remains is what Gesell calls the basic rate of interest (“Urzins”). It corresponds broadly to what is otherwise called pure or net interest. Gesell claims historical support for the basic rate having stayed roughly the same ever since the emergence of money, namely 3-5 per cent per year, because it is grounded in the inherent advantages that money confers on its holder.
Interest – which, as in John Maynard Keynes, is seen primarily as a payment needed to counteract the tendency to hoard money in the mattress – is not Gesell's primary target. It is instead viewed as a symptom of what Gesell regards as the fundamental problem: the function of money as capital. Interest is nonetheless a problem because it channels income flows to those who have money. This is held to be partly unjust, and partly the reason why production and consumption never properly balance on the goods market. People living on interest income hold excess liquidity (a surplus of money) that is not directly spent on goods. This makes the outlet for inventories uncertain and thereby leads to crises with falling goods prices and unemployment.
Read more …
Envoi
29 Jun, 2013 at 10:46 | Posted in Varia | Comments Off on Envoi
Like sea-fire a star glimmers, goes out and lights up
and goes out and lights up again. The quivering depths carry it
So I have stood at a hundred Land's Ends
and thought of what I want and what I am to do in the world

The one, I suppose, would be: to be as one is
The other, I suppose: to strain against the point
And had I held the sharp point less dear
I would gladly be like the others, more than gladly

Some people's essence is: to be. Others': to be without
Roads have no goal. It is paths that lead there
Did you see a window shining? Did you think of knocking on the pane?
Yours is a moonlit road, winding white through the swell.
(On YouTube one can revisit Tone Bengtsson's wonderfully fine 1995 TV portrait of Gunnar Ekelöf: http://youtu.be/NwEJCvniooA
h/t Jan Milch)
Read my lips – significance testing is no substitute for doing real science!
28 Jun, 2013 at 12:26 | Posted in Statistics & Econometrics | Comments Off on Read my lips – significance testing is no substitute for doing real science!

Jager and Leek may well be correct in their larger point, that the medical literature is broadly correct. But I don't think the statistical framework they are using is appropriate for the questions they are asking. My biggest problem is the identification of scientific hypotheses and statistical “hypotheses” of the “theta = 0” variety.
Based on the word “empirical” in the title, I thought the authors were going to look at a large number of papers with p-values and then follow up and see if the claims were replicated. But no, they don't follow up on the studies at all! What they seem to be doing is collecting a set of published p-values and then fitting a mixture model to this distribution, a mixture of a uniform distribution (for null effects) and a beta distribution (for non-null effects). Since only statistically significant p-values are typically reported, they fit their model restricted to p-values less than 0.05. But this all assumes that the p-values have this stated distribution. You don't have to be Uri Simonsohn to know that there's a lot of p-hacking going on. Also, as noted above, the problem isn't really effects that are exactly zero; the problem is that a lot of effects are lost in the noise and are essentially undetectable given the way they are studied.
Jager and Leek write that their model is commonly used to study hypotheses in genetics and imaging. I could see how this model could make sense in those fields … but I don’t see this model applying to published medical research, for two reasons. First … I don’t think there would be a sharp division between null and non-null effects; and, second, there’s just too much selection going on for me to believe that the conditional distributions of the p-values would be anything like the theoretical distributions suggested by Neyman-Pearson theory.
So, no, I don’t at all believe Jager and Leek when they write, “we are able to empirically estimate the rate of false positives in the medical literature and trends in false positive rates over time.” They’re doing this by basically assuming the model that is being questioned, the textbook model in which effects are pure and in which there is no p-hacking.
Indeed. If anything, this underlines how important it is not to equate science with statistical calculation. All science entails human judgement, and using statistical models does not relieve us of that necessity. When we work with misspecified models, the scientific value of significance testing is actually zero – even though we may be making perfectly valid statistical inferences! Statistical models and concomitant significance tests are no substitute for doing real science. Or as a noted German philosopher once famously wrote:
There is no royal road to science, and only those who do not dread the fatiguing climb of its steep paths have a chance of gaining its luminous summits.
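Gelman's p-hacking worry is, by the way, easy to make concrete. A minimal simulation sketch in Python – the “test five outcomes and report the smallest p-value” design is a hypothetical stand-in for the many forms p-hacking can take, not Jager and Leek's actual setting – shows how true nulls flood the record of significant results:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_studies, n_per_group, k_outcomes = 5_000, 50, 5

# Every effect below is truly null. An honest study runs one test and
# reports its p-value; a "p-hacked" study tests k outcomes and reports
# only the smallest.
honest, hacked = [], []
for _ in range(n_studies):
    ps = [stats.ttest_ind(rng.normal(size=n_per_group),
                          rng.normal(size=n_per_group)).pvalue
          for _ in range(k_outcomes)]
    honest.append(ps[0])
    hacked.append(min(ps))

honest, hacked = np.array(honest), np.array(hacked)
print(f"significant (p<0.05) among honest null studies: {(honest < .05).mean():.1%}")
print(f"significant (p<0.05) among hacked null studies: {(hacked < .05).mean():.1%}")
# Roughly 5% vs 23%: the published record of "significant" results gets
# flooded with true nulls, in proportions no uniform-plus-beta mixture
# fitted to reported p-values can be trusted to recover.
```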
The righteous (De rättfärdiga)
28 Jun, 2013 at 11:06 | Posted in Varia | 1 Comment
Here they stand now, pompous, dignified,
while the market fiddles rasp –
They are the ten righteous men
once sought for in Gomorrah.

Not at all so easy to get the better of;
they lift themselves with their words.
And had they something to stand on
they would lift the whole earth.

Wondrously sound in their brains;
devourers of men.
Speaking of the sun and the stars
to tillers of the soil.

Then they haul their own onto dry land,
pompous, dignified –
Ten pompous righteous men
who petrified God and Gomorrah.
Nils Ferlin: Tio rättfärdiga
Listen to Larry, Greg!
25 Jun, 2013 at 10:43 | Posted in Politics & Society | Comments Off on Listen to Larry, Greg!
Lawrence Summers listening to Greg Mankiw’s explications on the one percent?
Even though the interest may not be reciprocated, it would obviously be a good idea for Greg Mankiw to listen to his Harvard colleague Lawrence Summers, instead of trivializing the problems created by increasing inequality! Summers has some interesting thoughts on why income inequality is on the rise and what to do about it:
Why has the top 1 per cent of the population done so well relative to the rest? The answer probably lies substantially in changing technology and globalisation. When George Eastman revolutionised photography, he did very well and, because he needed a large number of Americans to carry out his vision, the city of Rochester had a thriving middle class for two generations. By contrast, when Steve Jobs revolutionised personal computing, he and the shareholders in Apple (who are spread all over the world) did very well but a much smaller benefit flowed to middle-class American workers both because production was outsourced and because the production of computers and software was not terribly labour intensive …
What then is the right response to rising inequality? There are too few good ideas in current political discourse and the development of better ones is crucial. Here are three.
First, government must be careful that it does not facilitate increases in inequality by rewarding the wealthy with special concessions. Where governments dispose of assets or allocate licences, there is a compelling case for more use of auctions to which all have access. Where government provides insurance implicitly or explicitly, premiums must be set as much as possible on a market basis rather than in consultation with the affected industry. A general posture for government of standing up for capitalism rather than particular well-connected capitalists would also serve to mitigate inequality.
Second, there is scope for pro-fairness, pro-growth tax reform. When there are more and more great fortunes being created and the government is in larger and larger deficit, it is hardly a time for the estate tax to be eviscerated. With smaller families and ever more bifurcation in the investment opportunities open to those with wealth, there is a real risk that the old notion of “shirtsleeves to shirtsleeves in three generations” will become obsolete, and those with wealth will endow dynasties.
Third, the public sector must insure that there is greater equity in areas of the most fundamental importance. It will always be the case in a market economy that some will have mansions, art and the ability to travel in lavish fashion. What is more troubling is that the ability of the children of middle-class families to attend college has been seriously compromised by increasing tuition fees and sharp cutbacks at public universities and colleges.
At the same time, in many parts of the country a gap has opened between the quality of the private school education offered to the children of the rich and the public school educations enjoyed by everyone else. Most alarming is the near doubling over the last generation in the gap between the life expectancy of the affluent and the ordinary.
Neither the politics of polarisation nor those of noblesse oblige will serve to protect the interests of the middle class in the post-industrial economy. We will have to find ways to do better.
On the poverty of econometric assumptions (wonkish)
24 Jun, 2013 at 21:39 | Posted in Statistics & Econometrics | Comments Off on On the poverty of econometric assumptions (wonkish)

[T]he authors take as their text a principle of Haavelmo that every testable economic theory should provide a precise formulation of the joint probability distribution of all observable variables to which it refers. It can be argued, however, that Haavelmo's principle is sounder than the program for realizing it worked out in this book. For, as noted above, what we are asked to assume is that the precept can be carried out in economics by techniques which are established for linear systems, serially independent disturbances, error-free observations, and samples of a size not generally obtainable in economic time series today. In view of such limitations, anyone using these techniques must find himself appealing at every stage less to what theory is saying to him than to what solvability requirements demand of him. Certain it is that the empirical work of this school yields numerous instances in which open questions of economics are resolved in a way that saves a mathematical theorem.
Still, there are doubtless many who will be prepared to make the assumptions required by this theory on pragmatic grounds. We cannot know in advance how well or badly they will work, and they commend themselves on the practical test of convenience. Moreover, as the authors point out, a great many models are compatible with what we know in economics – that is to say, do not violate any matters on which economists are agreed. Attractive as this view is, it fails to draw a necessary distinction between what is assumed and what is merely proposed as hypothesis. This distinction is forced upon us by an obvious but neglected fact of statistical theory: the matters “assumed” are put wholly beyond test, and the entire edifice of conclusions (e.g., about identifiability, optimum properties of the estimates, their sampling distributions, etc.) depends absolutely on the validity of these assumptions. The great merit of modern statistical inference is that it makes exact and efficient use of what we know about reality to forge new tools of discovery, but it teaches us painfully little about the efficacy of these tools when their basis of assumptions is not satisfied. It may be that the approximations involved in the present theory are tolerable ones; only repeated attempts to use them can decide that issue. Evidence exists that trials in this empirical spirit are finding a place in the work of the econometric school, and one may look forward to substantial changes in the methodological presumptions that have dominated this field until now.
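The complaint about “serially independent disturbances” can be given numerical flesh. A small Monte Carlo sketch in Python – the AR(1) setup and all parameter values are merely illustrative – shows how badly textbook OLS inference misbehaves when that assumption fails:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_sims, T, rho = 5_000, 50, 0.8
rejections = 0

# y_t = beta*x_t + u_t with true beta = 0, but both x_t and u_t are AR(1)
# processes. Textbook OLS inference assumes serially independent
# disturbances; with autocorrelated u_t the nominal 5% t-test rejects a
# true null far too often.
for _ in range(n_sims):
    x, u = np.zeros(T), np.zeros(T)
    for t in range(1, T):
        x[t] = rho * x[t - 1] + rng.normal()
        u[t] = rho * u[t - 1] + rng.normal()
    y = u                                   # beta = 0: x is pure noise for y
    beta_hat = (x @ y) / (x @ x)
    resid = y - beta_hat * x
    se = np.sqrt((resid @ resid) / (T - 1) / (x @ x))
    rejections += abs(beta_hat / se) > stats.t.ppf(0.975, T - 1)

print(f"rejection rate at nominal 5% size: {rejections / n_sims:.1%}")
# typically around 30% - several times the advertised size
```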
Neoclassical economics – emperor without clothes
20 Jun, 2013 at 18:54 | Posted in Economics | 1 Comment

Almost a century and a half after Léon Walras founded neoclassical general equilibrium theory, economists still have not been able to show that markets move economies to equilibria.
We do know that – under very restrictive assumptions – equilibria do exist, are unique and are Pareto-efficient. After reading Franklin M. Fisher‘s masterly article The stability of general equilibrium – what do we know and why is it important? one, however, has to ask oneself – what good does that do?
As long as we cannot show – except under exceedingly special assumptions – that there are convincing reasons to suppose there are forces leading economies to equilibria, the value of general equilibrium theory is nil. As long as we cannot demonstrate that there are forces operating – under reasonable, relevant and at least mildly realistic conditions – to move markets towards equilibria, there cannot be any sustainable reason for anyone to pay attention to this theory.
A stability that can only be proved by assuming “Santa Claus” conditions is of no avail. Most people do not believe in Santa Claus anymore. And for good reasons. Santa Claus is for kids, and general equilibrium economists ought to grow up, leaving their Santa Claus economics in the dustbin of history.
Continuing to model a world full of agents behaving as economists – “often wrong, but never uncertain” – while still not being able to show that the system converges to equilibrium under reasonable assumptions (or simply assuming the problem away), is a gross misallocation of intellectual resources and time.
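Nor is the instability a mere abstract possibility. Scarf (1960) constructed a simple three-good exchange economy in which Walrasian tâtonnement never converges. A short Python sketch, assuming the textbook parameterization of Scarf's example, makes the point:

```python
import numpy as np

# Scarf's (1960) three-good exchange economy: consumer i holds one unit of
# good i and has Leontief preferences over goods i and i+1 (mod 3), so at
# prices p she demands p_i/(p_i + p_{i+1}) of each. Excess demand for good j:
def excess_demand(p):
    z = np.empty(3)
    for j in range(3):
        z[j] = (p[j] / (p[j] + p[(j + 1) % 3])
                + p[j - 1] / (p[j - 1] + p[j]) - 1.0)
    return z

# Walrasian tatonnement: raise the price of goods in excess demand.
p = np.array([1.3, 0.9, 0.8])
eq = np.ones(3) / np.sqrt(3)        # equilibrium ray: p1 = p2 = p3
dt = 1e-3
for step in range(200_001):
    p += dt * excess_demand(p)
    if step % 50_000 == 0:
        direction = p / np.linalg.norm(p)
        print(f"t={step * dt:6.1f}  distance from equilibrium ray: "
              f"{np.linalg.norm(direction - eq):.4f}")
```

The printed distance never shrinks: relative prices orbit the equilibrium forever instead of being led to it.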
In case you think, even for a moment, that drawing this dismal conclusion is just an idiosyncrasy of yours truly and other heterodox economists, you had better think twice! Here is what one of the world's greatest microeconomists – Alan Kirman – writes in his thought-provoking paper The intrinsic limits of modern economic theory:
If one maintains the fundamentally individualistic approach to constructing economic models no amount of attention to the walls will prevent the citadel from becoming empty. Empty in the sense that one cannot expect it to house the elements of a scientific theory, one capable of producing empirically falsifiable propositions …
Starting from ‘badly behaved’ individuals, we arrive at a situation in which not only is aggregate demand a nice function but, by a result of Debreu, equilibrium will be ‘locally unique’. Whilst this means that at least there is some hope for local stability, the real question is, can we hope to proceed and obtain global uniqueness and stability?

The unfortunate answer is a categorical no! [The results of Sonnenschein (1972), Debreu (1974), Mantel (1976) and Mas-Colell (1985)] show clearly why any hope for uniqueness or stability must be unfounded … There is no hope that making the distribution of preferences or income ‘not too dispersed’ or ‘single peaked’ will help us to avoid the fundamental problem …
The problem seems to be embodied in what is an essential feature of a centuries-long tradition in economics, that of treating individuals as acting independently of each other …
To argue in this way suggests … that once the appropriate signals are given, individuals behave in isolation and the result of their behaviour may simply be added together …
The idea that we should start at the level of the isolated individual is one which we may well have to abandon … we should be honest from the outset and assert simply that by assumption we postulate that each sector of the economy behaves as one individual and not claim any spurious microjustification …
Economists therefore should not continue to make strong assertions about this behaviour based on so-called general equilibrium models which are, in reality, no more than special examples with no basis in economic theory as it stands.
From a macroeconomic point of view, the arguments of Fisher and Kirman also show why New Classical, Real Business Cycles, Dynamic Stochastic General Equilibrium (DSGE) and “New Keynesian” microfounded macromodels are such bad substitutes for real macroeconomic analysis.
These models try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality. And – worse still – something that is not even amenable to the kind of general equilibrium analysis they are thought to give a foundation for, since the Sonnenschein-Mantel-Debreu theorem unequivocally showed that no assumptions on individuals can guarantee either stability or uniqueness of the equilibrium solution.
Opting for cloned representative agents that are all identical is of course not a real solution to the fallacy of composition that the theorem points to. After all – as Nobel laureate Robert Solow noted in “The State of Macroeconomics” (Journal of Economic Perspectives 2008:243-249) – “a modern economy is populated by consumers, workers, pensioners, owners, managers, investors, entrepreneurs, bankers, and others, with different and sometimes conflicting desires, information, expectations, capacities, beliefs, and rules of behavior.” So, representative agent models are rather an evasion whereby issues of distribution, coordination, heterogeneity – everything that really defines macroeconomics – are swept under the rug.
Conclusion – don’t believe a single thing of what these microfounders tell you until they have told you how they have coped with – not evaded – Sonnenschein-Mantel-Debreu!
Of course, most neoclassical macroeconomists know that using a representative agent is a flagrantly illegitimate way of ignoring real aggregation issues. They keep on with their business nevertheless, just because it significantly simplifies what they are doing. It reminds one – not a little – of the drunkard who has lost his keys in some dark place and deliberately chooses to look for them under a neighbouring street light, just because it is easier to see there!
Suggestion for Mankiw’s reading list
20 Jun, 2013 at 12:28 | Posted in Economics, Politics & Society | 1 Comment

The myth is of a dynamic, creative, colourful, entrepreneurial private sector that at most needs ‘unleashing’ from the constraints imposed on it by the public sector. The latter is instead depicted as necessary for fixing ‘market failures’ (investing in ‘public goods’ like infrastructure or basic research) but inherently bureaucratic, slow, grey, and often too ‘meddling’. It is told to stick to the ‘basics’ but to avoid getting too directly involved in the economy …
All this fear about the government trying and failing to pick winners is exaggerated. Both Apple and the technologies behind the iPhone were picked! But picking winners is more probable when the state is described as though it is relevant rather than irrelevant …
Today, we see countries that are growing thanks to a courageous public sector and through mission oriented policies. For example, China is spending $1.7 trillion on five key new broadly defined sectors, including ‘environmentally friendly’ technologies. Brazil’s active state investment bank is spending more than $60 billion just this year on green technology. The economics profession doesn’t adequately account for this kind of state-led activity, but only warns of governments ‘crowding out’ private business or failing at picking winners …
The problem is that by not admitting this entrepreneurial risk-taking role that the state provides, we have not confronted a key relationship in finance: the relationship between risk and return. Innovation is deeply uncertain, with most attempts failing. For every Internet there are many Concordes or Solyndras. Yet this is also true for private venture capital (VC). But while private VC is then able to use the profits from the 1 out of 10 successes to fund the 9 losses, the state has not been allowed to reap a return. Economists think this will happen via tax (from the jobs created, and from the profits of the companies), yet so many of the companies that receive such benefits from state funding, bring their jobs elsewhere, and of course we know they also pay very little tax. Thus the return generating mechanisms must be rethought. It could be done through retaining equity, a ‘golden share’ of the intellectual property rights, or through income contingent loans …
What this means is that we have socialized the risk of innovation but privatised the rewards. This dynamic is one of the key drivers of increasing inequality. Because innovation tomorrow builds on innovation today, the ‘capture’ can be very large. This would not be the case if innovation were just a random walk. Policy makers must think very hard about how to make value creation activities (done by all the collective actors in the innovation game) rewarded above value extraction activities (in this sense capital gains taxes are way too low). And since the booty from the latter can be very large, redirecting incentives and rewards towards the value creators is essential. The problem is that some of the ‘extractors’ like to sell themselves as the creators.
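The risk-return asymmetry Mazzucato points to is simple arithmetic. A toy sketch in Python – every number in it is a hypothetical illustration, not an estimate – of why taxation alone cannot do for the state what equity does for a private VC portfolio:

```python
# Stylized arithmetic behind the "1 winner funds 9 losses" point.
# All numbers are hypothetical illustrations, not estimates.
n_projects, stake_per_project = 10, 1.0
winner_exit_value = 30.0      # the single success returns 30x its stake
tax_rate = 0.10               # effective tax actually collected on profits
equity_share = 0.20           # a retained "golden share" in the winner

invested = n_projects * stake_per_project

# Private VC keeps the whole upside of the one winner:
vc_net = winner_exit_value - invested

# A state that funds the same portfolio but can only tax the winner:
state_tax_only = tax_rate * (winner_exit_value - stake_per_project) - invested

# A state that also retains an equity share in the winner:
state_with_equity = (equity_share * winner_exit_value
                     + tax_rate * (1 - equity_share)
                       * (winner_exit_value - stake_per_project)
                     - invested)

print(f"private VC net:      {vc_net:+.1f}")            # +20.0
print(f"state, tax only:     {state_tax_only:+.1f}")    # -7.1
print(f"state, tax + equity: {state_with_equity:+.1f}") # ~ -1.7
```

With these illustrative numbers the private portfolio nets +20 while a tax-only state loses 7.1 on the identical bets; retaining equity moves the state most of the way back towards break-even, which is exactly the mechanism the quoted passage argues for.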
Economics and information
19 Jun, 2013 at 11:37 | Posted in Varia | 1 Comment
A balloonist, lost, sees someone walking down a country lane. The balloonist lowers the balloon and shouts down to the walker:
“Where am I?”
“About 20 feet above the ground,” comes the reply.
After a moment’s pondering, the balloonist says:
“You must be an economist.”
“How did you know?”
“Your information is perfectly correct and totally useless.”
A quick refresher on ergodicity (student stuff)
18 Jun, 2013 at 13:10 | Posted in Statistics & Econometrics | Comments Off on A quick refresher on ergodicity (student stuff)
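A small numerical illustration of the distinction may be helpful: for an ergodic process the time average along a single long realization recovers the ensemble mean, while for a non-ergodic one it does not. (A Python sketch with illustrative parameters.)

```python
import numpy as np

rng = np.random.default_rng(3)
T, n_paths = 10_000, 1_000

# Ergodic example: a stationary AR(1). The time average along one long
# path converges to the ensemble mean (here 0).
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.7 * x[t - 1] + rng.normal()
print(f"AR(1) time average over one path: {x.mean():+.3f}  (ensemble mean: 0)")

# Non-ergodic example: a random walk. Each path wanders off on its own;
# time averages differ path by path and never settle on the ensemble mean.
walks = rng.normal(size=(n_paths, T)).cumsum(axis=1)
time_avgs = walks.mean(axis=1)
print(f"std of random-walk time averages across paths: {time_avgs.std():.1f}"
      "  (they do not agree)")
```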
Neoclassical economics – updating a debate
17 Jun, 2013 at 22:04 | Posted in Economics, Theory of Science & Methodology | Comments Off on Neoclassical economics – updating a debate

Noah Smith has an update today on his blog responding to my critique of his post on what neoclassical economics is.
Noah starts by citing the following part of my article:
The basic problem with this definition of neoclassical economics – basically arguing that the differentia specifica of neoclassical economics is its use of demand and supply, utility maximization and rational choice – is that it doesn’t get things quite right. As we all know, there is an endless list of mainstream models that more or less distance themselves from one or the other of these characteristics. So the heart of neoclassical economic theory lies elsewhere.
He then says:
This is exactly the claim that “neoclassical” = “mainstream”. The clear implication of Syll’s syllogism is that no matter what sort of innovations mainstream economic theory embrace, no matter what old methods it discards, no matter what revolutions it undergoes, whatever it produces will be defined as “neoclassical” simply because it is in the mainstream. To me, that is clearly a counterproductive way of thinking about the world.
This conclusion, I would maintain, is rather unwarranted, since in the section directly after the one Smith cites I expressly write:
The essence of neoclassical economic theory is its exclusive use of a deductivist Euclidean methodology. A methodology that is more or less imposed as constituting economics, and, usually, without a smack of argument.
The theories and models that neoclassical economists construct describe imaginary worlds, using a combination of formal sign systems such as mathematics and ordinary language. The descriptions made are extremely thin and to a large degree disconnected from the specific contexts of the target system one (usually) wants to (partially) represent. This is not by chance. These closed formalistic-mathematical theories and models are constructed for the purpose of delivering purportedly rigorous deductions that may somehow be exportable to the target system. By analyzing a few causal factors in their “laboratories”, economists hope they can perform “thought experiments” and observe how these factors operate on their own and without impediments or confounders.
Unfortunately, this is not so. The reason is that economic causes never act in a socio-economic vacuum. Causes have to be set in a contextual structure to be able to operate, and this structure has to take some form or other. But instead of incorporating structures true to the target system, the settings of economic models are based on formalistic mathematical tractability. In the models they appear as unrealistic assumptions, usually playing a decisive role in enabling the deductive machinery to deliver “precise” and “rigorous” results.

This, of course, makes exporting to real-world target systems problematic, since these models – as part of a deductivist covering-law tradition in economics – are thought to deliver general and far-reaching conclusions that are externally valid. But how can we be sure the lessons learned in these theories and models have external validity, when they are based on highly specific unrealistic assumptions? As a rule, the more specific and concrete the structures, the less generalizable the results. Admitting that we can in principle move from (partial) falsehoods in theories and models to truth in real-world target systems does not take us very far, unless a thorough explication of the relation between theory, model and real-world target system is made. If models assume representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, the warrant for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged over to the real world is obviously missing. To have a deductive warrant for things happening in a closed model is no guarantee that they are preserved when applied to an open real-world target system.
So my argument is not that everything mainstream economists do has to be neoclassical simply because it is in the mainstream. My argument is about trying to delineate the core of a tenable scientific definition of neoclassical economics. And as long as mainstream economists – of whatever ilk – subscribe, explicitly or not, to this core, I cannot see why it should be considered wrong to continue labelling them neoclassical economists.
A quick refresher on mathematical proof techniques (student stuff)
16 Jun, 2013 at 14:26 | Posted in Theory of Science & Methodology | Comments Off on A quick refresher on mathematical proof techniques (student stuff)
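As a written companion to the refresher, here is the classic specimen of one such technique – proof by contradiction – set out in a short LaTeX sketch:

```latex
% A compact example of proof by contradiction (one of the standard
% techniques such a refresher covers): the irrationality of \sqrt{2}.
\begin{proof}
Suppose, for contradiction, that $\sqrt{2} = p/q$ with $p, q$ coprime
integers. Then $p^2 = 2q^2$, so $p^2$ is even, hence $p$ is even; write
$p = 2r$. Then $4r^2 = 2q^2$, so $q^2 = 2r^2$ and $q$ is even as well,
contradicting the coprimality of $p$ and $q$. Hence $\sqrt{2}$ is
irrational.
\end{proof}
```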
Neoclassical economics – the true picture
15 Jun, 2013 at 12:12 | Posted in Varia | 4 Comments

Noah Smith's picture of neoclassical economics looks like this:
In reality, I would argue, it looks more like this:
A quick refresher on mathematical induction (student stuff)
15 Jun, 2013 at 10:28 | Posted in Statistics & Econometrics | Comments Off on A quick refresher on mathematical induction (student stuff)
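As a written companion, a minimal worked example of the technique – the standard induction proof of the formula for the sum of the first $n$ integers – in LaTeX:

```latex
% A minimal worked example of mathematical induction:
% \sum_{k=1}^{n} k = n(n+1)/2.
\begin{proof}
\emph{Base case:} for $n = 1$, $\sum_{k=1}^{1} k = 1 = \tfrac{1 \cdot 2}{2}$.
\emph{Inductive step:} assume $\sum_{k=1}^{n} k = \tfrac{n(n+1)}{2}$. Then
\[
  \sum_{k=1}^{n+1} k \;=\; \frac{n(n+1)}{2} + (n+1)
                     \;=\; \frac{(n+1)(n+2)}{2},
\]
which is the formula for $n+1$. By induction the formula holds for all
$n \ge 1$.
\end{proof}
```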