What the euro is all about

30 Jun, 2018 at 09:58 | Posted in Economics | 3 Comments

There are still economists and politicians out there who think that the euro is the only future for Europe. However, there seem to be some rather basic facts about optimal currency areas that it would perhaps be wise to consider …

The idea that the euro has “failed” is dangerously naive. The euro is doing exactly what its progenitor – and the wealthy 1%-ers who adopted it – predicted and planned for it to do.

That progenitor is former University of Chicago economist Robert Mundell. The architect of “supply-side economics” is now a professor at Columbia University, but I knew him through his connection to my Chicago professor, Milton Friedman …

The euro would really do its work when crises hit, Mundell explained. Removing a government’s control over currency would prevent nasty little elected officials from using Keynesian monetary and fiscal juice to pull a nation out of recession.

“It puts monetary policy out of the reach of politicians,” he said. “[And] without fiscal policy, the only way nations can keep jobs is by the competitive reduction of rules on business” …

As another Nobelist, Paul Krugman, notes, the creation of the eurozone violated the basic economic rule known as “optimum currency area”. This was a rule devised by Bob Mundell.

That doesn’t bother Mundell. For him, the euro wasn’t about turning Europe into a powerful, unified economic unit. It was about Reagan and Thatcher …

Mundell explained to me that, in fact, the euro is of a piece with Reaganomics:

“Monetary discipline forces fiscal discipline on the politicians as well.”

And when crises arise, economically disarmed nations have little to do but wipe away government regulations wholesale, privatize state industries en masse, slash taxes and send the European welfare state down the drain.

Greg Palast/The Guardian

Paul Krugman — a math-is-the-message-modeler

29 Jun, 2018 at 09:18 | Posted in Economics | 1 Comment

In a post on his blog, Paul Krugman argues that ‘Keynesian’ macroeconomics more than anything else “made economics the model-oriented field it has become.” In Krugman’s eyes, Keynes was a “pretty klutzy modeler,” and it was only thanks to Samuelson’s famous 45-degree diagram and Hicks’ IS-LM that things fell into place. Although admitting that economists have a tendency to use “excessive math” and “equate hard math with quality,” he still vehemently defends — and always has — the mathematization of economics:

I’ve seen quite a lot of what economics without math and models looks like — and it’s not good.

Sure, ‘New Keynesian’ economists like Mankiw and Krugman — and their forerunners, ‘Keynesian’ economists like Paul Samuelson and (young) John Hicks — certainly have contributed to making economics more mathematical and “model-oriented.”

But if these math-is-the-message-modelers aren’t able to show that the mechanisms or causes that they isolate and handle in their mathematically formalized macromodels are also applicable to the real world, these mathematical models are of limited value to our understanding of real-world economies.

When it comes to modeling philosophy, Krugman defends his position in the following words (my italics):

I don’t mean that setting up and working out microfounded models is a waste of time. On the contrary, trying to embed your ideas in a microfounded model can be a very useful exercise — not because the microfounded model is right, or even better than an ad hoc model, but because it forces you to think harder about your assumptions, and sometimes leads to clearer thinking. In fact, I’ve had that experience several times.

The argument is hardly convincing. If people put the enormous amount of time and energy that they do into constructing macroeconomic models, those models really have to contribute substantially to our understanding of, and our ability to explain, real macroeconomic processes.

For years Krugman has, in more than one article, criticized mainstream economics for using too much (bad) mathematics and axiomatics in its model-building endeavours. But when it comes to defending his own position on various issues, he usually ends up falling back on the same kind of models himself. In his End This Depression Now — just to take one example — Krugman maintains that although he doesn’t buy “the assumptions about rationality and markets that are embodied in many modern theoretical models, my own included,” he still finds them useful “as a way of thinking through some issues carefully.” When it comes to methodology and assumptions, Krugman obviously has a lot in common with the kind of model-building he otherwise criticizes.

If macroeconomic models — no matter what ilk — make assumptions, and we know that real people and markets cannot be expected to obey these assumptions, the warrant for supposing that the models’ conclusions or hypotheses can be bridged to the real world is obviously non-justifiable.

A gadget is just a gadget — and brilliantly silly simple models — IS-LM included — do not help us come to grips with the fundamental issues of modern economies any more than brilliantly silly complicated models — calibrated DSGE and RBC models included.
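
For readers who have never met the 45-degree “gadget,” here is a minimal sketch — all parameter values are illustrative assumptions, nothing more — of what the Samuelson cross amounts to: equilibrium income is whatever level of output makes planned expenditure equal to output.

```python
# A minimal sketch of the Samuelson 45-degree 'gadget' (the Keynesian cross).
# All parameter values are illustrative assumptions, not estimates.

def keynesian_cross(autonomous: float, mpc: float) -> float:
    """Equilibrium income Y solving Y = autonomous + mpc * Y."""
    assert 0 < mpc < 1, "the gadget only closes for an MPC strictly between 0 and 1"
    return autonomous / (1 - mpc)

# Autonomous spending of 100 and a marginal propensity to consume of 0.8
# give equilibrium income 500 and the textbook multiplier 1/(1 - 0.8) = 5.
print(keynesian_cross(autonomous=100.0, mpc=0.8))  # 500.0
```

That is the whole model. Whether those few lines — or their DSGE descendants — latch on to anything in a real-world economy is precisely the question the gadget itself cannot answer.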

Krugman’s formalization schizophrenia

28 Jun, 2018 at 10:17 | Posted in Economics | 7 Comments

In an article published last week, Nicholas Gruen criticized the modern vogue of formalization in economics and took Paul Krugman as an example:

He’s saying first that economists can’t see what isn’t in their models – whereas Hicks and pretty much every economist until the late twentieth century would have understood the need for careful and ongoing reconciliation of formal modelling and other sources of knowledge. More shockingly he’s saying that those who smell a rat at the dysfunctionality of all this should just get over themselves. To quote Krugman:

“You may not like this tendency; certainly economists tend to be too quick to dismiss what has not been formalized (although I believe that the focus on models is basically right).”

It’s ironic given how compellingly Krugman has documented the regression of macroeconomics in the same period that saw his own rise via new trade theory. I think both retrogressions were driven by formalisation at all costs, though, in the case of new classical macro, this mindset gave additional licence to the motivated reasoning of the libertarian right. In each case, economics regresses into scholastic abstractions, and obviously important parts of the story slide into pristine invisibility to the elect.

Responding to the article, Paul Krugman yesterday rode out to defend formalism in economics:

What about new trade theory? What we new trade theorists did was say, “It looks as if there’s a lot going on in world trade that can’t be explained in existing formal models. So let’s see if there’s a different approach to modeling that can make sense of what we see” …

Now, we can argue about how much good this formalization did. I still believe that the formal models provided a level of clarity and legitimacy to trade discussion that wasn’t there before.

Now, this is not — as shown in Gruen’s article — the first time Krugman has felt the urge to defend mainstream formalization. In another post up on his blog, Krugman argues that the ‘discipline of modeling’ is a sine qua non for tackling politically and emotionally charged economic issues:

In my experience, modeling is a helpful tool (among others) in avoiding that trap, in being self-aware when you’re starting to let your desired conclusions dictate your analysis. Why? Because when you try to write down a model, it often seems to lead some place you weren’t expecting or wanting to go. And if you catch yourself fiddling with the model to get something else out of it, that should set off a little alarm in your brain.

So when Krugman and other ‘modern’ mainstream economists use their models — standardly assuming rational expectations, Walrasian market clearing, unique equilibria, time invariance, linear separability and homogeneity of both inputs/outputs and technology, infinitely lived intertemporally optimizing representative agents with homothetic and identical preferences, etc. — and standardly ignoring complexity, diversity, uncertainty, coordination problems, non-market clearing prices, real aggregation problems, emergence, expectations formation, etc. — we are supposed to believe that this somehow helps them ‘to avoid motivated reasoning that validates what you want to hear’ and provide ‘legitimacy.’

Yours truly is, to say the least, far from convinced. The alarm that goes off in my brain tells me that this, rather than being helpful for understanding real-world economic issues, sounds more like an ill-advised plaidoyer for voluntarily taking on a methodological straitjacket of unsubstantiated and known-to-be-false assumptions.

Krugman, as we all know, can at times be a harsh critic of economic formalism. That is, of other people’s formalism. For his own formalizations, and those of other ‘New Keynesians’, he always seems to find some handy justification.

Contrary to the impression Krugman wants to convey, his and other ‘New Keynesian’ economists’ modelling strategy has a lot in common with that of people like Robert Lucas and Thomas Sargent. ‘New Keynesian’ macroeconomic models build on Real Business Cycle foundations, regularly assuming representative actors, rational expectations, market clearing and equilibrium. But if we know that real people and markets cannot be expected to obey these assumptions, the warrant for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged to the real world is obviously non-justifiable.

IQ tests, genetics, and the Flynn Effect

28 Jun, 2018 at 09:17 | Posted in Varia | Comments Off on IQ tests, genetics, and the Flynn Effect

 

The main reason why almost all econometric models are wrong

27 Jun, 2018 at 11:59 | Posted in Statistics & Econometrics | 3 Comments

Since econometrics doesn’t content itself with only making optimal predictions, but also aspires to explain things in terms of causes and effects, econometricians need loads of assumptions — the most important of these being additivity and linearity. Important, simply because if they do not hold, your model is invalid and descriptively incorrect. And when the model is wrong — well, then it’s wrong.

The assumption of additivity and linearity means that the outcome variable is, in reality, linearly related to any predictors … and that if you have several predictors then their combined effect is best described by adding their effects together …

This assumption is the most important because if it is not true then even if all other assumptions are met, your model is invalid because you have described it incorrectly. It’s a bit like calling your pet cat a dog: you can try to get it to go in a kennel, or to fetch sticks, or to sit when you tell it to, but don’t be surprised when its behaviour isn’t what you expect because even though you’ve called it a dog, it is in fact a cat. Similarly, if you have described your statistical model inaccurately it won’t behave itself and there’s no point in interpreting its parameter estimates or worrying about significance tests or confidence intervals: the model is wrong.

Andy Field
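
Field’s cat-and-dog point is easy to make concrete. The following sketch — simulated data, with every number invented purely for illustration — generates an outcome driven entirely by an interaction between two predictors and then fits the standard additive linear model to it:

```python
# What violated additivity does to a linear regression.
# Simulated data; all numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)

# True data-generating process: a pure interaction plus noise.
# The effect of x1 depends entirely on the value of x2 -- nothing is additive.
y = 2.0 * x1 * x2 + rng.normal(scale=0.5, size=n)

# Fit the standard additive model y = b0 + b1*x1 + b2*x2 by least squares.
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
print("additive coefficients:", np.round(beta, 3))                    # all ~0
print("R^2 of additive model:", round(1 - resid.var() / y.var(), 3))  # ~0
```

The additive model dutifully reports that neither predictor matters, while together the two of them drive almost all the variation in the outcome. In a simulation the near-zero R² gives the game away; in observational economics, where the true data-generating process is unknown, the cat simply goes on being called a dog.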

Let me take the opportunity to elaborate a little on why I find these assumptions to be of such paramount importance, and why they ought to be much more explicitly argued for — on both epistemological and ontological grounds — if they are to be used at all.

Limiting model assumptions in economic science always have to be closely examined. If we are going to be able to show that the mechanisms or causes we isolate and handle in our models are stable — in the sense that they do not change when we “export” them to our “target systems” — we have to show that they hold not only under ceteris paribus conditions. Mechanisms that hold only ceteris paribus are a fortiori of limited value for our understanding, explanation, or prediction of real economic systems.

Our admiration for technical virtuosity should not blind us to the fact that we have to take a cautious attitude towards probabilistic inference in economic contexts. We should look out for causal relations, but econometrics can never be more than a starting point in that endeavour, since econometric (statistical) explanations are not explanations in terms of mechanisms, powers, capacities or causes. Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that there are other variables — of vital importance and, although perhaps unobservable and non-additive, not necessarily epistemologically inaccessible — that were not considered for the model. The variables that were included can hence never be guaranteed to be more than potential causes, not real causes.

A rigorous application of econometric methods in economics really presupposes that the phenomena of our real-world economies are ruled by stable causal relations between variables. A perusal of the leading econom(etr)ic journals shows that most econometricians still concentrate on fixed-parameter models, and that parameter values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption one has to convincingly establish that the targeted acting causes are stable and invariant, so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.

Real-world social systems are not governed by stable causal mechanisms or capacities. The kinds of “laws” and relations that econometrics has established, are laws and relations about entities in models that presuppose causal mechanisms being atomistic and additive. When causal mechanisms operate in real-world systems they only do it in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain they do it (as a rule) only because we engineered them for that purpose. Outside man-made ‘nomological machines’ they are rare, or even non-existent. Unfortunately, that also makes most of the achievements of econometrics – as most of the contemporary endeavours of mainstream economic theoretical modelling – rather useless. No matter how often you call your pet cat a dog, it is still nothing but a cat …
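
The exportability problem is just as easy to simulate. In the sketch below — once again with purely illustrative numbers — a parameter is estimated in one ‘spatio-temporal context’ and then exported to a second context in which the underlying relation has quietly shifted:

```python
# A 'fixed parameter' estimated in one regime and exported to another.
# Purely illustrative simulated data.
import numpy as np

rng = np.random.default_rng(1)

def regime(slope: float, n: int = 500):
    """Data from y = slope * x + noise."""
    x = rng.normal(size=n)
    return x, slope * x + rng.normal(scale=0.3, size=n)

# Regime A: the relation really is y = 1.5 x, and OLS recovers it nicely.
xa, ya = regime(slope=1.5)
beta_a = (xa @ ya) / (xa @ xa)                  # OLS slope through the origin
print("slope estimated in regime A:", round(float(beta_a), 2))  # ~1.5

# Regime B: an unmodelled structural change -- the slope is now -0.5.
xb, yb = regime(slope=-0.5)
beta_b = (xb @ yb) / (xb @ xb)
print("MSE with exported parameter :", round(float(np.mean((yb - beta_a * xb) ** 2)), 2))
print("MSE re-estimated in regime B:", round(float(np.mean((yb - beta_b * xb) ** 2)), 2))
# Nothing in regime A's data warns us that the 'parameter' was a feature of
# that particular context rather than a stable feature of the world.
```

The exported parameter fits regime A superbly and regime B disastrously — and, crucially, nothing inside regime A’s data could have told us which case we were in.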

Uncertainty heuristics

27 Jun, 2018 at 09:56 | Posted in Theory of Science & Methodology | Comments Off on Uncertainty heuristics

 

Animal cruelty and human dignity

26 Jun, 2018 at 17:09 | Posted in Politics & Society | 1 Comment

 

Throughout European history the idea of the human being has been expressed in contradistinction to the animal. The latter’s lack of reason is the proof of human dignity. So insistently and unanimously has this antithesis been recited … that few other ideas are so fundamental to Western anthropology. The antithesis is acknowledged even today. The behaviorists only appear to have forgotten it. That they apply to human beings the same formulae and results which they wring without restraint from defenseless animals in their abominable physiological laboratories proclaims the difference in an especially subtle way. The conclusion they draw from the mutilated animal bodies applies, not to animals in freedom, but to human beings today. By mistreating animals they announce that they, and only they in the whole of creation, function voluntarily in the same mechanical, blind, automatic way as the twitching movements of the bound victims made use of by the expert …

In this world liberated from appearance — in which human beings, having forfeited reflection, have become once more the cleverest animals, which subjugate the rest of the universe when they happen not to be tearing themselves apart — to show concern for animals is considered no longer merely sentimental but a betrayal of progress.

So much for ‘statistical objectivity’

26 Jun, 2018 at 15:39 | Posted in Statistics & Econometrics | 4 Comments

Last year, we recruited 29 teams of researchers and asked them to answer the same research question with the same data set. Teams approached the data with a wide array of analytical techniques, and obtained highly varied results …

All teams were given the same large data set collected by a sports-statistics firm across four major football leagues. It included referee calls, counts of how often referees encountered each player, and player demographics including team position, height and weight. It also included a rating of players’ skin colour …

Of the 29 teams, 20 found a statistically significant correlation between skin colour and red cards … Findings varied enormously, from a slight (and non-significant) tendency for referees to give more red cards to light-skinned players to a strong trend of giving more red cards to dark-skinned players …

Had any one of these 29 analyses come out as a single peer-reviewed publication, the conclusion could have ranged from no race bias in referee decisions to a huge bias.

Raphael Silberzahn & Eric Uhlmann

Research that strongly underlines that even in statistics the researcher has many degrees of freedom. In statistics — as in economics and econometrics — the results we get depend on the assumptions we make in our models. Changing those assumptions — which often play a more important role than the data we feed into our models — leads to far-reaching changes in our conclusions. Using statistics is no guarantee of getting at any ‘objective truth.’
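
The red-card study’s lesson is easy to reproduce in miniature. The sketch below — simulated data and hypothetical specification choices, for illustration only — runs four equally defensible regression specifications on one and the same data set and reports the spread in the coefficient of interest:

```python
# A toy version of the 29-teams exercise: same data, several defensible
# specifications, a spread of answers. Simulated, illustrative data.
import numpy as np

rng = np.random.default_rng(2)
n = 2_000
z = rng.normal(size=n)                                 # background covariate
d = (rng.normal(size=n) + 0.8 * z > 0).astype(float)   # 'treatment' indicator
y = 0.2 * d + 1.0 * z + rng.normal(size=n)             # true effect of d: 0.2

def ols(X, y):
    """Least-squares coefficient vector."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

ones = np.ones(n)
specs = {
    "raw difference":       np.column_stack([ones, d]),
    "adjust for z":         np.column_stack([ones, d, z]),
    "adjust for z and z^2": np.column_stack([ones, d, z, z ** 2]),
    "interact d with z":    np.column_stack([ones, d, z, d * z]),
}
for name, X in specs.items():
    print(f"{name:>22}: estimated effect of d = {ols(X, y)[1]: .3f}")
# The raw difference comes out several times larger than the true 0.2,
# while the adjusted specifications land near it -- and a plausible referee
# report could be written in defence of every single one of them.
```

Twenty-nine teams with real, messy data and a far larger menu of defensible choices can evidently fan out a great deal further than this toy example does.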

Take full responsibility for your life

26 Jun, 2018 at 10:14 | Posted in Varia | Comments Off on Take full responsibility for your life


Never give in. Never give up.

Lying with statistics

25 Jun, 2018 at 19:25 | Posted in Statistics & Econometrics | Comments Off on Lying with statistics

 

Nonsense is nonsense

25 Jun, 2018 at 16:32 | Posted in Economics | 3 Comments

The Conservative belief that there is some law of nature which prevents men from being employed, that it is “rash” to employ men, and that it is financially ‘sound’ to maintain a tenth of the population in idleness for an indefinite period, is crazily improbable – the sort of thing which no man could believe who had not had his head fuddled with nonsense for years and years … Our main task, therefore, will be to confirm the reader’s instinct that what seems sensible is sensible, and what seems nonsense is nonsense. We shall try to show him that the conclusion, that if new forms of employment are offered more men will be employed, is as obvious as it sounds and contains no hidden snags; that to set unemployed men to work on useful tasks does what it appears to do, namely, increases the national wealth; and that the notion, that we shall, for intricate reasons, ruin ourselves financially if we use this means to increase our well-being, is what it looks like – a bogy.

John Maynard Keynes (1929)

Brave women

25 Jun, 2018 at 15:50 | Posted in Politics & Society | Comments Off on Brave women

Courage is the capability to confront fear — when facing the powerful and mighty, not to step back, but to stand up for one’s right not to be humiliated or abused.

Courage is to do the right thing in spite of danger and fear, and to keep going even when given the chance to turn back.

Dignity, a better life, justice and the rule of law are things worth fighting for. Refusing to step back when confronting the mighty and powerful creates courageous acts that stay in our memories and mean something — as when this Iranian woman refuses to be harassed by self-appointed moral guardians.

The American descent

24 Jun, 2018 at 12:57 | Posted in Politics & Society | Comments Off on The American descent

The speed of America’s moral descent under Donald Trump is breathtaking. In a matter of months we’ve gone from a nation that stood for life, liberty and the pursuit of happiness to a nation that tears children from their parents and puts them in cages.

What’s almost equally remarkable about this plunge into barbarism is that it’s not a response to any actual problem. The mass influx of murderers and rapists that Trump talks about, the wave of crime committed by immigrants here (and, in his mind, refugees in Germany), are things that simply aren’t happening. They’re just sick fantasies being used to justify real atrocities.

And you know what this reminds me of? The history of anti-Semitism, a tale of prejudice fueled by myths and hoaxes that ended in genocide.

Paul Krugman

On the limits of ‘mediation analysis’ and ‘statistical causality’

23 Jun, 2018 at 23:18 | Posted in Statistics & Econometrics | 5 Comments

“Mediation analysis” is this thing where you have a treatment and an outcome and you’re trying to model how the treatment works: how much does it directly affect the outcome, and how much is the effect “mediated” through intermediate variables …

In the real world, it’s my impression that almost all the mediation analyses that people actually fit in the social and medical sciences are misguided: lots of examples where the assumptions aren’t clear and where, in any case, coefficient estimates are hopelessly noisy and where confused people will over-interpret statistical significance …

More and more I’ve been coming to the conclusion that the standard causal inference paradigm is broken … So how to do it? I don’t think traditional path analysis or other multivariate methods of the throw-all-the-data-in-the-blender-and-let-God-sort-em-out variety will do the job. Instead we need some structure and some prior information.

Andrew Gelman
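
Gelman’s worry is easy to demonstrate. In the sketch below — simulated, purely illustrative data — the treatment affects the outcome only directly, while an unobserved confounder drives both the mediator and the outcome; the standard two-regression mediation recipe then conjures up a ‘mediated’ effect that does not exist:

```python
# How an unobserved mediator-outcome confounder wrecks the standard
# two-regression mediation decomposition. Simulated, illustrative data.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
u = rng.normal(size=n)                            # unobserved confounder
t = rng.binomial(1, 0.5, n).astype(float)         # randomized treatment
m = 0.5 * t + u + rng.normal(size=n)              # mediator, confounded by u
y = 0.5 * t + 0.0 * m + u + rng.normal(size=n)    # truth: zero mediated effect

def ols(X, y):
    """Least-squares coefficient vector."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

ones = np.ones(n)
total = ols(np.column_stack([ones, t]), y)[1]       # ~0.50 -- correct
direct = ols(np.column_stack([ones, t, m]), y)[1]   # ~0.25 -- badly biased
print("total effect:            ", round(float(total), 2))
print("'direct' effect given m: ", round(float(direct), 2))
print("implied 'mediated' share:", round(float(1 - direct / total), 2))  # ~0.5
# Conditioning on m -- a collider on the path t -> m <- u -- induces a
# spurious association between t and u, so half of a purely direct effect
# gets attributed to a mediation channel that is not there.
```

No amount of statistical significance in those two regressions would warn the analyst that the decomposition rests on an unverifiable no-confounding assumption — which is exactly Gelman’s point about needing structure and prior information.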

Causality in the social sciences — and economics — can never be solely a question of statistical inference. Causality entails more than predictability, and really explaining social phenomena in depth requires theory. Analysis of variation — the foundation of all econometrics — can never in itself reveal how these variations are brought about. Only when we are able to tie actions, processes or structures to the statistical relations detected can we say that we are getting at relevant explanations of causation.

Most facts have many different possible alternative explanations, and we want to find the best of all contrastive explanations (since all real explanation takes place relative to a set of alternatives). So which is the best explanation? Many scientists, influenced by statistical reasoning, think that the likeliest explanation is the best explanation. But the likelihood of x is not in itself a strong argument for thinking it explains y. I would rather argue that what makes one explanation better than another are things like aiming for and finding powerful, deep, causal features and mechanisms that we have warranted and justified reasons to believe in. Statistical reasoning — especially the variety based on a Bayesian epistemology — generally has no room for these kinds of explanatory considerations: the only thing that matters is the probabilistic relation between evidence and hypothesis. That is also one of the main reasons I find abduction — inference to the best explanation — a better description and account of what constitutes actual scientific reasoning and inference.

In the social sciences … regression is used to discover relationships or to disentangle cause and effect. However, investigators have only vague ideas as to the relevant variables and their causal order; functional forms are chosen on the basis of convenience or familiarity; serious problems of measurement are often encountered.

Regression may offer useful ways of summarizing the data and making predictions. Investigators may be able to use summaries and predictions to draw substantive conclusions. However, I see no cases in which regression equations, let alone the more complex methods, have succeeded as engines for discovering causal relationships.

David Freedman

Some statisticians and data scientists think that algorithmic formalisms somehow give them access to causality. That is, however, simply not true. Assuming ‘convenient’ things like faithfulness or stability is not to give proofs; it is to assume what has to be proven. Deductive-axiomatic methods used in statistics do not produce evidence for causal inferences. The real causality we are searching for is the one existing in the real world around us. If there is no warranted connection between axiomatically derived theorems and the real world, well, then we haven’t really obtained the causation we are looking for.

If contributions made by statisticians to the understanding of causation are to be taken over with advantage in any specific field of inquiry, then what is crucial is that the right relationship should exist between statistical and subject-matter concerns …
The idea of causation as consequential manipulation is apt to research that can be undertaken primarily through experimental methods, and especially to ‘practical science’ where the central concern is indeed with ‘the consequences of performing particular acts’. The development of this idea in the context of medical and agricultural research is as understandable as the development of that of causation as robust dependence within applied econometrics. However, the extension of the manipulative approach into sociology would not appear promising, other than in rather special circumstances … The more fundamental difficulty is that under the — highly anthropocentric — principle of ‘no causation without manipulation’, the recognition that can be given to the action of individuals as having causal force is in fact peculiarly limited.

John H. Goldthorpe

To save Europe we have to abandon the euro

23 Jun, 2018 at 12:30 | Posted in Economics | 9 Comments

The euro crisis is far from over. The tough austerity measures imposed in the eurozone have made economy after economy contract. And they have made things worse not only in the periphery countries, but also in countries like France and Germany. These are alarming facts that should be taken seriously.

The problems — created to a large extent by the euro — may not only endanger our economies, but also our democracy itself. How much whipping can democracy take? How many more are going to get seriously hurt and ruined before we end this madness and scrap the euro?

The ‘European idea’—or better: ideology—notwithstanding, the euro has split Europe in two. As the engine of an ever-closer union the currency’s balance sheet has been disastrous. Norway and Switzerland will not be joining the EU any time soon; Britain is actively considering leaving it altogether. Sweden and Denmark were supposed to adopt the euro at some point; that is now off the table. The Eurozone itself is split between surplus and deficit countries, North and South, Germany and the rest. At no point since the end of World War Two have its nation-states confronted each other with so much hostility; the historic achievements of European unification have never been so threatened …

Anyone wishing to understand how an institution such as the single currency can wreak such havoc needs a concept of money that goes beyond that of the liberal economic tradition and the sociological theory informed by it. The conflicts in the Eurozone can only be decoded with the aid of an economic theory that can conceive of money not merely as a system of signs that symbolize claims and contractual obligations, but also, in tune with Weber’s view, as the product of a ruling organization, and hence as a contentious and contested institution with distributive consequences full of potential for conflict …

Now more than ever there is a grotesque gap between capitalism’s intensifying reproduction problems and the collective energy needed to resolve them … This may mean that there is no guarantee that the people who have been so kind as to present us with the euro will be able to protect us from its consequences, or will even make a serious attempt to do so. The sorcerer’s apprentices will be unable to let go of the broom with which they aimed to cleanse Europe of its pre-modern social and anti-capitalist foibles, for the sake of a neoliberal transformation of its capitalism. The most plausible scenario for the Europe of the near and not-so-near future is one of growing economic disparities—and of increasing political and cultural hostility between its peoples, as they find themselves flanked by technocratic attempts to undermine democracy on the one side, and the rise of new nationalist parties on the other. These will seize the opportunity to declare themselves the authentic champions of the growing number of so-called losers of modernization, who feel they have been abandoned by a social democracy that has embraced the market and globalization.

Wolfgang Streeck

Great article — and it actually confirms what Wynne Godley wrote more than twenty years ago:

If a government stops having its own currency, it doesn’t just give up “control over monetary policy” as normally understood; its spending powers also become constrained in an entirely new way. If a government does not have its own central bank on which it can draw cheques freely, its expenditures can be financed only by borrowing in the open market in competition with businesses, and this may prove excessively expensive or even impossible, particularly under “conditions of extreme emergency.” If Europe is not to have a full-scale budget of its own under the new arrangements it will still have, by default, a fiscal stance of its own made up of the individual budgets of component states. The danger, then, is that the budgetary restraint to which governments are individually committed will impart a disinflationary bias that locks Europe as a whole into a depression it is powerless to lift.

Wynne Godley
