Moments you never forget (personal)

26 Apr, 2018 at 10:13 | Posted in Varia | Comments Off on Moments you never forget (personal)

Courage is the capability to confront fear — when facing the powerful and mighty, not to step back, but to stand up for one’s right not to be humiliated or abused.

Courage is doing the right thing in spite of danger and fear — keeping on even when given the opportunity to turn back.

Dignity, a better life, justice and the rule of law are things worth fighting for. Refusing to step back — even when confronting the mighty and powerful — creates courageous acts that stay in our memories and mean something, like the political demonstration conducted by Tommie Smith and John Carlos at the 1968 Summer Olympics in Mexico City.

I was ten years old at the time and followed it all on TV. Moments like that you never forget. It has stayed with me all these years.

DSGE models — false by construction

25 Apr, 2018 at 18:05 | Posted in Economics, Statistics & Econometrics | Comments Off on DSGE models — false by construction

Advances in mathematical tools and in economic theory rapidly changed the landscape. From the perspective of macroeconomics, the streamlined DSGE models of the 1980s begot much richer models in the 1990s. One remarkably successful extension was the introduction of nominal and real rigidities, i.e., the conception that agents cannot immediately adjust to changes in the economic environment. In particular, many of the new DSGE models focused on studying the consequences of limitations in how frequently or how easily agents can change prices and wages (prices and wages are “sticky”). Since the spirit of these models seemed to capture the tradition of Keynesian economics, they quickly became known as New Keynesian DSGE models (Woodford, 2003) …

One of the features of the New Macroeconometrics that the readers of this volume will find exciting is that the (overwhelming?) majority of it is done from an explicitly Bayesian perspective …

The Bayesian approach deals in a transparent way with misspecification and identification problems, which are pervasive in the estimation of DSGE models … After all, a DSGE model is a very stylized and simplified view of the economy that focuses only on the most important mechanisms at play. Hence, the model is false by construction and we need to keep this notion constantly in view …

Jesús Fernández-Villaverde et al.

Keep that “constantly in view.”

Hmm.

Seems some economists have a bad memory.

True Bromance

25 Apr, 2018 at 15:45 | Posted in Economics, Varia | Comments Off on True Bromance

 

The Lucas critique comes back with a vengeance in DSGE models

25 Apr, 2018 at 10:08 | Posted in Economics, Statistics & Econometrics | 3 Comments

Both approaches to DSGE macroeconometrics (VAR and Bayesian) have evident vulnerabilities, which derive substantially from how parameters are handled in the technique. In brief, parameters from formally elegant models are calibrated in order to obtain simulated values that reproduce some stylized fact and/or some empirical data distribution, thus relating the underlying theoretical model to the observational data. But there are at least three main respects in which this practice fails.

First of all, DSGE models have substantial difficulties in taking account of many important mechanisms that actually govern real economies, for example, institutional constraints like the tax system, thereby reducing DSGE power in policy analysis … In the attempt to deal with this serious problem, various parameter constraints on the model policy block are provided. They derive from institutional analysis and reflect policymakers’ operational procedures. However, such model extensions, which are intended to reshape its predictions to reality and to deal with the underlying optimization problem, prove to be highly inflexible, turning DSGE into a “straitjacket tool” … In particular, the structure imposed on DSGE parameters entails various identification problems, such as observational equivalence, underidentification, and partial and weak identification.

These problems affect both empirical DSGE approaches. Fundamentally, they are ascribable to the likelihoods to be estimated. In fact, the range of structural parameters that generate impulse response functions and data distributions fitting very closely to the true ones does include model specifications that show very different features and welfare properties. So which is the right model specification (i.e., parameter set) to choose? As a consequence, reasonable estimates derive not from the informative content of models and data, but rather from the ancillary restrictions that are necessary to make the likelihoods informative, which are often arbitrary. Thus, after Lucas’s super-exogeneity critique has been thrown out the door, it comes back through the window.

Roberto Marchionatti & Lisa Sella
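The identification problem the quoted passage describes — observationally equivalent parameter sets — can be made concrete with a toy example. A minimal sketch (a made-up two-parameter model, not any actual DSGE specification): only the product of the two ‘structural’ parameters enters the likelihood, so the data cannot tell very different structural stories apart.

```python
# Toy illustration of observational equivalence: in the hypothetical
# model y_t = (a*b) * y_{t-1} + eps_t, only the product a*b enters the
# likelihood, so (a, b) = (0.9, 0.5) and (0.5, 0.9) fit the data
# exactly equally well, however different their 'structural' stories.
import numpy as np

rng = np.random.default_rng(0)

def simulate(a, b, T=500):
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = a * b * y[t - 1] + rng.standard_normal()
    return y

def log_likelihood(y, a, b):
    # Gaussian log-likelihood with unit innovation variance
    resid = y[1:] - a * b * y[:-1]
    return -0.5 * np.sum(resid**2) - 0.5 * len(resid) * np.log(2 * np.pi)

y = simulate(0.9, 0.5)
print(log_likelihood(y, 0.9, 0.5))   # identical ...
print(log_likelihood(y, 0.5, 0.9))   # ... to this: same product a*b
```

Only an arbitrary restriction — fixing a or b in advance — makes the likelihood informative about the remaining parameter, which is exactly the point made above.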

Our admiration for technical virtuosity should not blind us to the fact that we have to have a cautious attitude towards probabilistic inferences in economic contexts. We should look out for causal relations, but econometrics can never be more than a starting point in that endeavour, since econometric (statistical) explanations are not explanations in terms of mechanisms, powers, capacities or causes. Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that there are other variables — of vital importance and, although perhaps unobservable and non-additive, not necessarily epistemologically inaccessible — that were not considered for the model. The variables that were included can hence never be guaranteed to be more than potential, rather than real, causes. A rigorous application of econometric methods in economics really presupposes that the phenomena of our real-world economies are ruled by stable causal relations between variables. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.

This is a more fundamental and radical problem than the celebrated ‘Lucas critique’ has suggested. The question is not whether deep parameters, absent at the macro level, exist in ‘tastes’ and ‘technology’ at the micro level. It goes deeper. Real-world social systems are not governed by stable causal mechanisms or capacities.

The kinds of laws and relations that econom(etr)ics has established are laws and relations about entities in models that presuppose causal mechanisms to be atomistic and additive. When causal mechanisms operate in real-world social systems, they mostly do so in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so (as a rule) only because we engineered them for that purpose. Outside man-made ‘nomological machines’ they are rare, or even non-existent. Unfortunately, that also makes most of the achievements of econometrics rather useless.

Both the ‘Lucas critique’ and the ‘Keynes critique’ of econometrics argued that it was inadmissible to project history onto the future. Consequently, an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different ‘variables’ was not enough. If they could not get at the causal structure that generated the data, they were not really ‘identified’. Lucas himself drew the conclusion that the problem with unstable relations was to be solved by constructing models with clear microfoundations, in which forward-looking optimizing individuals and robust, deep, behavioural parameters remain stable even under changes in economic policy. As yours truly has argued in a couple of posts — e.g. here and here — this, however, is a dead end.

The loanable funds fallacy

24 Apr, 2018 at 14:26 | Posted in Economics | 2 Comments

The loanable funds theory is in many regards nothing but an approach in which the ruling rate of interest in society is — pure and simple — conceived as nothing else than the price of loans or credits, set by banks and determined by supply and demand — as Bertil Ohlin put it — “in the same way as the price of eggs and strawberries on a village market.”

It is a beautiful fairy tale, but the problem is that banks are not barter institutions that transfer pre-existing loanable funds from depositors to borrowers. Why? Because, in the real world, there simply are no pre-existing loanable funds. Banks create new funds — credit — only when someone takes on new debt! Banks are monetary institutions, not barter vehicles.

In the traditional loanable funds theory — as presented in mainstream macroeconomics textbooks — the amount of loans and credit available for financing investment is constrained by how much saving is available. Saving is the supply of loanable funds, investment is the demand for loanable funds, and the latter is assumed to be negatively related to the interest rate. Lowering households’ consumption means increasing saving, which via a lower interest rate increases investment.
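In stylized form, the textbook story amounts to a single market-clearing condition — a minimal rendering, not any particular textbook’s notation:

```latex
% Textbook loanable funds market: saving supplies funds, investment
% demands them, and the interest rate r clears the market.
\[
S(r) = I(r), \qquad S'(r) > 0, \qquad I'(r) < 0,
\]
% so a rightward shift of the saving schedule lowers the equilibrium
% rate r* and thereby raises investment.
```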

That view has been shown to have very little to do with reality. It’s nothing but an otherworldly neoclassical fantasy. But there are many other problems as well with the standard presentation and formalization of the loanable funds theory:

As already noted by James Meade decades ago, the causal story told to explicate the accounting identities used gives the picture of “a dog called saving wagged its tail labelled investment.” In Keynes’s view — later confirmed over and over again by empirical research — it is not so much the interest rate at which firms can borrow that causally determines the amount of investment undertaken, but rather their internal funds, profit expectations and capacity utilization.

As is typical of most mainstream macroeconomic formalizations and models, there is precious little mention of real-world phenomena — e.g. real money, credit rationing and the existence of multiple interest rates — in the loanable funds theory. Loanable funds theory essentially reduces modern monetary economies to something akin to barter systems — something they definitely are not. As emphasized especially by Minsky, to understand and explain how much investment/loaning/crediting is going on in an economy, it is much more important to focus on the workings of financial markets than to stare at accounting identities like S = Y – C – G. The problems we meet in modern markets today have more to do with inadequate financial institutions than with the size of loanable-funds-savings.

The loanable funds theory in the ‘New Keynesian’ approach means that the interest rate is endogenized by assuming that Central Banks can (try to) adjust it in response to a possible output gap. This, of course, is essentially nothing but an assumption of Walras’ law being valid and applicable, and that, a fortiori, the attainment of equilibrium is secured by the Central Banks’ interest rate adjustments. From a realist Keynes-Minsky point of view, this cannot be considered anything other than a belief resting on nothing but sheer hope. [Not to mention that more and more Central Banks actually choose not to follow Taylor-like policy rules.] The age-old belief that Central Banks control the money supply has more and more come to be questioned and replaced by an ‘endogenous’ money view, and I think the same will happen to the view that Central Banks determine “the” rate of interest.
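For reference, the ‘Taylor-like policy rules’ mentioned in passing are usually written along the lines of Taylor’s original (1993) formulation, with the nominal policy rate responding to the inflation gap and the output gap:

```latex
\[
i_t = r^{*} + \pi_t + 0.5\,(\pi_t - \pi^{*}) + 0.5\,\tilde{y}_t
\]
% i_t: nominal policy rate; r*: equilibrium real rate; pi_t: inflation;
% pi*: the inflation target; \tilde{y}_t: the output gap.
```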

A further problem in the traditional loanable funds theory is that it assumes that saving and investment can be treated as independent entities. This is seriously wrong:

The classical theory of the rate of interest [the loanable funds theory] seems to suppose that, if the demand curve for capital shifts or if the curve relating the rate of interest to the amounts saved out of a given income shifts or if both these curves shift, the new rate of interest will be given by the point of intersection of the new positions of the two curves. But this is a nonsense theory. For the assumption that income is constant is inconsistent with the assumption that these two curves can shift independently of one another. If either of them shifts, then, in general, income will change; with the result that the whole schematism based on the assumption of a given income breaks down … In truth, the classical theory has not been alive to the relevance of changes in the level of income or to the possibility of the level of income being actually a function of the rate of investment.

There are always (at least) two parties to an economic transaction. Savers and investors have different liquidity preferences and face different choices — and their interactions usually only take place intermediated by financial institutions. This, importantly, also means that there is no ‘direct and immediate’ automatic interest mechanism at work in modern monetary economies. What this ultimately boils down to is — again — that what happens at the microeconomic level — both in and out of equilibrium — is not always compatible with the macroeconomic outcome. The fallacy of composition (the ‘atomistic fallacy’ of Keynes) has many faces — loanable funds is one of them.

Contrary to the loanable funds theory, finance in the world of Keynes and Minsky precedes investment and saving. Highlighting the loanable funds fallacy, Keynes wrote in “The Process of Capital Formation” (1939):

Increased investment will always be accompanied by increased saving, but it can never be preceded by it. Dishoarding and credit expansion provides not an alternative to increased saving, but a necessary preparation for it. It is the parent, not the twin, of increased saving.

What is ‘forgotten’ in the loanable funds theory is the insight that finance — in all its different shapes — has its own dimension and, if taken seriously, its effect on an analysis must modify the whole theoretical system and not just be added as an unsystematic appendage. Finance is fundamental to our understanding of modern economies, and acting like the baker’s apprentice who, having forgotten to add yeast to the dough, throws it into the oven afterwards, simply isn’t enough.

All real economic activities nowadays depend on a functioning financial machinery. But institutional arrangements, states of confidence, fundamental uncertainties, asymmetric expectations, the banking system, financial intermediation, loan granting processes, default risks, liquidity constraints, aggregate debt, cash flow fluctuations, etc., etc. — things that play decisive roles in channelling money/savings/credit — are more or less left in the dark in modern formalizations of the loanable funds theory.

It should be emphasized that the equality between savings and investment … will be valid under all circumstances. In particular, it will be independent of the level of the rate of interest, which was customarily considered in economic theory to be the factor equilibrating the demand for and supply of new capital. In the present conception, investment, once carried out, automatically provides the savings necessary to finance it. Indeed, in our simplified model, profits in a given period are the direct outcome of capitalists’ consumption and investment in that period. If investment increases by a certain amount, savings out of profits are pro tanto higher …

One important consequence of the above is that the rate of interest cannot be determined by the demand for and supply of new capital because investment ‘finances itself.’
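The accounting behind Kalecki’s ‘investment finances itself’ can be spelled out in a few lines — a stylized sketch under his own simplifying assumptions (closed economy, no government, workers consuming their entire wage bill):

```latex
% Income side: Y = W + P (wages plus profits).
% Expenditure side: Y = C_w + C_k + I.
% With C_w = W (workers spend what they earn):
\[
W + P = C_w + C_k + I \;\Longrightarrow\; P = C_k + I,
\]
\[
S \equiv Y - C_w - C_k = P - C_k = I .
\]
% Saving equals investment identically, whatever the rate of interest.
```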

So, yes, the ‘secular stagnation’ will be over as soon as we free ourselves from the loanable funds theory — and the scholastic gibbering about the ZLB — and start using good old Keynesian fiscal policies.

Scientific racism and the alt-right

24 Apr, 2018 at 13:14 | Posted in Politics & Society | Comments Off on Scientific racism and the alt-right

What role genes play in IQ can be studied by finding identical twins who were separated at birth and grew up apart. There are only a few documented cases in which twins grew up in different families that at the same time belonged to different social classes with divergent levels of education. Studies here showed marked differences in IQ scores — in one case there were 20 IQ points between the twins, in another as many as 29 …

Research on adoption confirms this impression. Whoever looks at whole populations instead of single individuals discovers a similar pattern.

The most important IQ theorist of the past 50 years is the New Zealander James Flynn. He found that IQ tests have to become more demanding with every generation if an average of 100 is to be maintained … He established that average IQ scores in 1900, measured by today’s standards, would be around 70.

What has changed has nothing to do with genetics. Instead, people today are more often confronted with the kind of abstract logic that IQ tests measure. Some population groups encounter it more often than others, which also explains why their IQ scores differ from one another. Flynn showed that the differing average scores of populations can be explained entirely by external influences …

Despite the overwhelming evidence against it, race science remains a fixture of the views of the American alt-right, who deploy it again and again as a political battering ram for their small-state agenda. Whoever believes that the poor are poor because they were born stupid needs little imagination to extend that thesis to entire population groups afflicted by poverty.

Gavin Evans / Die Zeit

Finland ends basic income experiment

24 Apr, 2018 at 12:04 | Posted in Economics | Comments Off on Finland ends basic income experiment

Europe’s first national government-backed experiment in giving citizens free cash will end next year after Finland decided not to extend its widely publicised basic income trial and to explore alternative welfare schemes instead.

Since January 2017, a random sample of 2,000 unemployed people aged 25 to 58 have been paid a monthly €560 (£475), with no requirement to seek or accept employment …

The scheme – aimed primarily at seeing whether a guaranteed income might incentivise people to take up paid work by smoothing out gaps in the welfare system – is strictly speaking not a universal basic income (UBI) trial, because the payments are made to a restricted group and are not enough to live on.

But it was hoped it would shed light on policy issues such as whether an unconditional payment might reduce anxiety among recipients and allow the government to simplify a complex social security system that is struggling to cope with a fast-moving and insecure labour market …

The idea of UBI – appealing both to the left, which hopes it can cut poverty and inequality, and to the right, which sees it as a possible route to a leaner, less bureaucratic welfare system – has gained traction recently amid predictions that automation could threaten up to a third of current jobs.

Jon Henley/The Guardian

Tractability hoax redux

23 Apr, 2018 at 18:32 | Posted in Economics | 2 Comments

A ‘tractable’ model is one that you can solve, which means there are several types of tractability: analytical tractability (finding a solution to a theoretical model), empirical tractability (being able to estimate/calibrate your model) and computational tractability (finding numerical solutions). It is sometimes hard to discriminate between theoretical and empirical, or empirical and computational tractability …

What I’d like to capture is the effect of those choices economists make “for convenience,” to be able to reach solutions, to simplify, to ease their work, in short, to make a model tractable. While those assumptions are conventional and meant to be lifted as mathematical, theoretical and empirical skills and technology (hardware and software) ‘progress,’ their underlying rationale is often lost as they are taken up by other researchers, spread, and become standard (implicit in the last sentence is the idea that what a tractable model is evolves as new techniques and technologies are brought in) …

The tractability lens also helps me make sense of what is happening in economics now, and what might come next. Right now, clusters of macroeconomists are each working on relaxing one or two tractability assumptions: research agendas span heterogeneity, non-rational expectations, financial markets, non-linearities, fat-tailed distributions, etc. But if you put all these add-ons together (assuming you can design a consistent model, and that add-ons are the way forward, which many critics challenge), you’re back to non-tractable. So what is the priority? How do macroeconomists rank these model improvements? And can the profession afford to wait 30 more years, 3 more financial crises and two trade wars before it can finally say it has a model rich enough to anticipate crises?

Beatrice Cherrier

Important questions that serious economists ought to ask themselves. Using ‘simplifying’ tractability assumptions — rational expectations, common knowledge, representative agents, linearity, additivity, ergodicity, etc. — because otherwise they cannot ‘manipulate’ their models or come up with ‘rigorous’ and ‘precise’ predictions and explanations, does not exempt economists from having to justify their modelling choices. Being able to ‘manipulate’ things in models cannot per se be enough to warrant a methodological choice. If economists — as Cherrier conjectures — do not think their tractability assumptions make for good and realist models, it is certainly fair to ask for clarification of the ultimate goal of the whole modelling endeavour.

Take for example the ongoing discussion on rational expectations as a modelling assumption. Those who want to build macroeconomics on microfoundations usually maintain that the only robust policies are those based on rational expectations and representative-actor models. As yours truly has tried to show in On the use and misuse of theories and models in mainstream economics, there is really no support for this conviction at all. If microfounded macroeconomics has nothing to say about the real world and the economic problems out there, why should we care about it? The final court of appeal for macroeconomic models is not whether we — once we have made our tractability assumptions — can ‘manipulate’ them, but the real world. And as long as no convincing justification is put forward for how the inferential bridging de facto is made, macroeconomic model-building is little more than hand-waving that gives us little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

Browning et al. (1999, p. 545) recognize in regard to DSGEs that “the microeconomic evidence is often incompatible with the macroeconomic model being calibrated.” These authors highlight three main criticalities feeding the micro–macro gap that DSGEs largely neglect for reasons of computational tractability: heterogeneity — widely documented by empirical evidence — in preferences, constraints, and skills; uncertainty, for it is fundamental to distinguish between micro and macro uncertainty, and to separate it from measurement error and model misspecification; and the synthesis of the micro evidence, since a plethora of micro studies often implies very different assumptions that prevent the (estimated) parameters from fitting any kind of context.

Finally, worth recalling is the problem of the intertemporal inconsistency of the rational expectations hypothesis with unanticipated structural breaks. The empirical importance of this point is very evident on considering the effects of the latest financial and economic crisis.

Roberto Marchionatti & Lisa Sella

Manifesto against the new antisemitism

22 Apr, 2018 at 12:42 | Posted in Politics & Society | 1 Comment

Antisemitism is not the Jews’ affair; it is everyone’s affair. The French, whose democratic maturity has been on display after every Islamist attack, are living through a tragic paradox. Their country has become the theatre of a murderous antisemitism. This terror is spreading, provoking both popular condemnation and a media silence that the recent marche blanche has helped to break …

The denunciation of Islamophobia — which is not the anti-Arab racism that must be fought — conceals the Interior Ministry’s figures: French Jews are 25 times more likely to be assaulted than their Muslim fellow citizens. 10% of the Jewish citizens of Île-de-France — that is, about 50,000 people — have recently been forced to move because they were no longer safe in certain housing estates and because their children could no longer attend the schools of the Republic …

Why this silence? Because Islamist radicalisation — and the antisemitism it carries with it — is regarded by part of the French elites exclusively as the expression of a social revolt, even though the same phenomenon can be observed in societies as different as Denmark, Afghanistan, Mali and Germany… Because to the old antisemitism of the far right has been added the antisemitism of part of the radical left, which has found in anti-Zionism the alibi for turning the tormentors of the Jews into victims of society. Because electoral baseness calculates that the Muslim vote is ten times larger than the Jewish vote …

We expect the Islam of France to lead the way. We demand that the fight against this democratic failure that is antisemitism become a national cause before it is too late. Before France is no longer France.

The list of signatories:
Charles Aznavour … Elisabeth de Fontenay … Nicolas Sarkozy … Pascal Bruckner … Gérard Depardieu … Carla Bruni … Bernard-Henri Lévy … Julia Kristeva … Luc Ferry … Alain Finkielkraut … Danielle Cohen-Levinas …

Le Parisien

A new paradigm for teaching economics

22 Apr, 2018 at 12:08 | Posted in Economics | 1 Comment

Don’t let the students know! What we teach them in our intro classes bears little resemblance to how we do economics ourselves …

The new paradigm not only provides a more convincing story about how an economy might reach a competitive equilibrium, it also fundamentally alters the nature of that outcome. When lenders and borrowers, and employers and employees, are modelled as principals and agents with asymmetric information, who interact under an incomplete contract, credit and labour markets do not clear in competitive equilibrium …

The gap between concerns about major economic problems that bring students to our classrooms, and the topics we teach, is a second motivation for the CORE Project. During the past four years we have asked, in classrooms around the world: “What is the most pressing problem that economists should address?” The word cloud below shows what students at the Humboldt University in Berlin told us:

[Figure: word cloud of students’ answers at Humboldt University, Berlin]

Word clouds from students in Sydney, London and Bogota are barely distinguishable from Berlin … Even more remarkably, in 2016 we put the same question to new recruits – mostly economics graduates – at the Bank of England, and to professional economists and other staff at the New Zealand Treasury and Reserve Bank. Both responded with a similar concern about inequality. Word clouds from France gave greater prominence to unemployment. All of them highlight climate change and environmental problems, automation, and financial instability.

Samuel Bowles & Wendy Carlin

China — a challenge to economic theories

22 Apr, 2018 at 10:28 | Posted in Economics | Comments Off on China — a challenge to economic theories

With the collapse of the Soviet Union, many intellectuals anticipated an end of history: the market and democracy would replace Gosplan and the domination of the Communist Party. Since 1989, democratic regimes have spread across most continents, and the logic of the market seems to dominate the political choices that governments used to implement in the past. The 2010s, however, mark an inflection, as authoritarian regimes multiply that bear no more than a distant resemblance to the democratic ideal. Simultaneously, in the economic order, a growing number of governments claim to be retaking control of the process of internationalisation.

The Russian trajectory testifies to the failure of democratisation as a preliminary to economic modernisation, and the Chinese trajectory invalidates the prognosis that would make democracy the political regime necessary for economic performance. One even finds, in China, a justification for centralised power: the challenges are said to be so numerous, and the urgency such, that the deliberations proper to democracy would not allow them to be met. The multiplicity of Chinese investments abroad makes an alternative to the Washington Consensus a live possibility. It has therefore become essential to identify the springs, but also the weaknesses, that underlie China’s dynamism …

The United States is no longer the uncontested reference for the organisation of contemporary societies. After the collapse of the Soviet Union came the time of Japan, and today China is perceived as representing an alternative, to the point of giving rise to the idea of a Beijing consensus. Accelerated development and economic success, doubts about the virtues of democracy, the rise of a scientific power: so many assets in the eyes of various governments tempted by the verticality of political power. In fact, this ‘model’ rests on the strength of a continental economy, the role of a party-state and its place within a long tradition of exercising power — characteristics that mortgage its diffusion. The weakness of other nations feeds its international expansion, but also the sources of dependence, which are hardly favourable to development. Finally, numerous tensions and imbalances run through Chinese society, to the point of prompting the search for another socio-economic regime. The Chinese lesson, then, is no doubt that every society must set its strategy within the long sweep of history, and that every model eventually meets its limits.

Robert Boyer/Le Monde

Antisemitism in France

22 Apr, 2018 at 09:49 | Posted in Politics & Society | Comments Off on Antisemitism in France

FIGAROVOX. – One year after the murder of Sarah Halimi, we learn of the death of Mireille Knoll in similar circumstances. What does this tell us about the state of antisemitism in France?


Alexis LACROIX. – That the alarm threshold has been reached. And that the virtuous denial has lasted long enough. Virtuous denial, yes, because, on the pretext of stigmatising no one, we have collectively postponed the moment of naming the enemy. And the enemy is, as the President of the Republic said in his homage to Arnaud Beltrame, Islamism – an Islamism not necessarily «underground», but indeed always «insidious», which undermines whole tracts of our Republic and there fights against the rule of law.

The Muslim Brotherhood, who in no way represent the Muslims of France, excel in the practice of ideological infiltration. If they do not always openly preach antisemitism, their militancy is the soil on which it has developed, following the patient, almost Machiavellian, logic of a work of sapping.

The media were rather discreet at the time of the Halimi affair. Do you think that, in the aftermath of the Trèbes attacks, the murder of Mireille Knoll will provoke mass indignation?

The Republic has, for too long, retreated too far. It was necessary to react with the requisite gravity and stature. The marche blanche of 28 March, despite the incidents that marred it, goes in the right direction. It shows that, everywhere in France, the beginnings of an awakening are taking shape. From the four corners of the country, in milieus that are sociologically, politically and religiously very diverse, these crimes are – at last! – being held to be unacceptable.

Le Figaro

1968 — The whole world in turmoil

21 Apr, 2018 at 10:11 | Posted in Politics & Society | Comments Off on 1968 — The whole world in turmoil

It was not only in the politics of ideas that America was the great reservoir from which social criticism almost everywhere in the West drew. With regard to the forms and techniques of protest, too, a pioneering role fell to the «land of unlimited possibilities», as it was then still called without much sarcasm. The sit-ins in the American South, with which, from 1960 onward, a young generation of Blacks (themselves invoking the nonviolent resistance of a Mahatma Gandhi) fought back against the never-ending structures of apartheid, also shaped the picture four years later in Berkeley, when the Free Speech Movement arose at the Californian elite university – the movement that, as has rightly been said, was the «mother of all student revolts».


From the days and nights on the Berkeley campus onward, the repertoire of protest forms unfolded. Go-ins and teach-ins travelled around the globe as concept and practice, as did the certainty, already matured among the Free Speech activists, that nobody over thirty was to be trusted. And from the Summer of Love in San Francisco (1967) at the latest, psychedelic happenings joined the political ones: the be-ins, love-ins and the like.

This mingling of pop culture and politics, the de-bordering of the political – captured in the German slogan «Alles ist politisch» («everything is political») – was among the world-spanning characteristics of the «restless years». It was both a product of that mentality of the post-war decades which contemporary historiography now discusses under the heading of Cold War Culture, and an answer to it.

Norbert Frei/Die Zeit

The case for a new economics

20 Apr, 2018 at 19:04 | Posted in Economics | Comments Off on The case for a new economics

When the great crash hit a decade ago, the public realised that the economics profession was clueless …

After 10 years in the shadow of the crisis, the profession’s more open minds have recognised there is serious re-thinking to be done …

But the truth is that most of the “reforms” have been about adding modules to the basic template, leaving the core of the old discipline essentially intact. My view is that this is insufficient, and treats the symptoms rather than the underlying malaise …

If we accept that we need fundamental reform, what should the new economics—“de-conomics” as I’m calling it—look like?

First, we need to accept that there is no such thing as “value-free” analysis of the economy. As I’ve explained, neoclassical economics pretends to be ethically neutral while smuggling in an individualistic, anti-social ethos …

Second, the analysis needs to be based around how human beings actually operate—rather than how neoclassicism asserts that “rational economic person (or firm)” should operate …

Third, we need to put the good life centre stage, rather than prioritising the areas that are most amenable to analysis via late-19th century linear mathematics. Technological progress and power relationships between firms, workers and governments need to be at the heart of economic discourse and research …

Finally, economics needs to be pluralistic. For the last half-century neoclassical economics has been gradually colonising other social science disciplines such as sociology and political science. It is high time this process reversed itself so that there was two-way traffic and a mutually beneficial learning exchange between disciplines. It is possible—and probably desirable—that the “deconomics” of the future looks more like psychology, sociology or anthropology than it does today’s arid economics …

The change I am seeking is no more fundamental than the transition from classical to neoclassical economics, and that was accomplished without the discipline imploding. And this time around we’ve got then-unimaginable data and other resources. So there can be no excuse for delay. Let economists free themselves of a misleading map, and then—with clear eyes—look at the world anew.

Howard Reed/Prospect Magazine

Mainstream economists are of course not overjoyed when confronted with this kind of critique. Diane Coyle’s reply to Reed in Prospect Magazine is typical.

Those of us in the economics community who are impolite enough to dare question the preferred methods and models applied in mainstream economics are as a rule met with disapproval. But although people seem to get very agitated and upset by the critique — just read the commentaries on this blog if you don’t believe me — defenders of “received theory” always say that the critique is “nothing new”, that they have always been “well aware” of the problems, and so on, and so on.

So, for the benefit of Diane Coyle and all other mindless practitioners of mainstream economic modeling who don’t want to be disturbed in their doings, David Freedman has put together a very practical list of vacuous responses to criticism that can be freely used to save their peace of mind:

We know all that. Nothing is perfect … The assumptions are reasonable. The assumptions don’t matter. The assumptions are conservative. You can’t prove the assumptions are wrong. The biases will cancel. We can model the biases. We’re only doing what everybody else does. Now we use more sophisticated techniques. If we don’t do it, someone else will. What would you do? The decision-maker has to be better off with us than without us … The models aren’t totally useless. You have to do the best you can with the data. You have to make assumptions in order to make progress. You have to give the models the benefit of the doubt. Where’s the harm?

The tractability hoax in modern economics

20 Apr, 2018 at 11:16 | Posted in Economics | Comments Off on The tractability hoax in modern economics

While the paternity of the theoretical apparatus underlying the new neoclassical synthesis in macro is contested, there is wide agreement that the methodological framework was largely architected by Robert Lucas … Bringing in a representative agent meant forgoing the possibility to tackle inequality, redistribution and justice concerns. Was it deliberate? How much does this choice owe to tractability? What macroeconomists were chasing, in these years, was a renewed explanation of the business cycle. They were trying to write microfounded and dynamic models …

Rational expectations imposed cross-equation restrictions, yet estimating these new models substantially raised the computing burden. Assuming a representative agent mitigated computational demands, and allowed macroeconomists to get away with general equilibrium aggregate issues: it made new-classical models analytically and computationally tractable …

Was tractability the main reason why Lucas embraced the representative agent (and market clearing)? Or could he have improved tractability through alternative hypotheses, leading to opposed policy conclusions? … Some macroeconomists may have endorsed the new class of Lucas-critique-proof models because they liked its policy conclusions. Others may have retained some hypotheses, then some simplifications, “because it makes the model tractable.” And while the limits of simplifying assumptions are often emphasized by those who propose them, as they spread, the caveats are forgotten. Tractability restricts the range of accepted models and prevents economists from discussing some social issues, and with time, from even “seeing” them. Tractability ‘filters’ economists’ reality … The aggregate effect of “looking for tractable models” is unknown, and yet it is crucial to understand the current state of economics.

Beatrice Cherrier
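One way to see why a representative agent ‘mitigated computational demands’ is simply to count states. A back-of-the-envelope sketch (hypothetical grid sizes, purely illustrative):

```python
# With one representative agent, a value function defined on a
# 100-point asset grid has 100 states to evaluate. Tracking N
# heterogeneous agents jointly blows the state space up to 100**N.
grid_points = 100

for n_agents in (1, 2, 3, 4):
    print(f"{n_agents} agent(s): {grid_points ** n_agents:,} grid states")
# 1 agent(s): 100 grid states
# 2 agent(s): 10,000 grid states
# 3 agent(s): 1,000,000 grid states
# 4 agent(s): 100,000,000 grid states
```

The exponential blow-up is the computational pull towards the representative agent; whether that simplification is methodologically defensible is precisely what is at issue below.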

Cherrier’s highly readable article underlines that the essence of mainstream (neoclassical) economic theory is its almost exclusive use of a deductivist methodology — a methodology that is more or less used without a smack of argument to justify its relevance.

The theories and models that mainstream economists construct describe imaginary worlds, using a combination of formal sign systems such as mathematics and ordinary language. The descriptions made are extremely thin and to a large degree disconnected from the specific contexts of the target system that one (usually) wants to (partially) represent. This is not by chance. These closed formalistic-mathematical theories and models are constructed for the purpose of being able to deliver purportedly rigorous deductions that may somehow be exportable to the target system. By analyzing a few causal factors in their “laboratories”, they hope they can perform “thought experiments” and observe how these factors operate on their own and without impediments or confounders.

Unfortunately, this is not so. The reason is that economic causes never act in a socio-economic vacuum. Causes have to be set in a contextual structure to be able to operate. This structure has to take some form or other, but instead of incorporating structures that are true to the target system, the settings made in economic models are based on formalistic mathematical tractability. In the models they appear as unrealistic assumptions, usually playing a decisive role in getting the deductive machinery to deliver “precise” and “rigorous” results. This, of course, makes exporting to real-world target systems problematic, since these models — as part of a deductivist covering-law tradition in economics — are thought to deliver general and far-reaching conclusions that are externally valid. But how can we be sure the lessons learned in these theories and models have external validity when they are based on highly specific unrealistic assumptions? As a rule, the more specific and concrete the structures, the less generalizable the results. Admitting that we can in principle move from (partial) falsehoods in theories and models to truth in real-world target systems does not take us very far unless a thorough explication of the relation between theory, model and the real-world target system is made. If models assume representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, the warrants for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged are obviously non-justifiable. To have a deductive warrant for things happening in a closed model is no guarantee that they are preserved when applied to an open real-world target system.

Henry Louis Mencken once wrote that “there is always an easy solution to every human problem – neat, plausible and wrong.” And mainstream economics has indeed been wrong. Very wrong. Its main result, so far, has been to demonstrate the futility of trying to build a satisfactory bridge between formalistic-axiomatic deductivist models and real-world target systems. Assuming, for example, perfect knowledge, instant market clearing and approximating aggregate behaviour with unrealistically heroic assumptions about representative actors, just will not do. The assumptions made surreptitiously eliminate the very phenomena we want to study: uncertainty, disequilibrium, structural instability and problems of aggregation and coordination between different individuals and groups.

The punch line is that most of the problems that mainstream economics is wrestling with issue from its attempts at formalistic modelling per se of social phenomena. Reducing microeconomics to refinements of hyper-rational Bayesian deductivist models is not a viable way forward. It will only sentence to irrelevance the most interesting real-world economic problems. And as someone has so wisely remarked, murder is — unfortunately — the only way to reduce biology to chemistry; reducing macroeconomics to Walrasian general equilibrium microeconomics basically means committing the same crime.

If scientific progress in economics — as Robert Lucas and other latter-day mainstream economists seem to think — lies in our ability to tell “better and better stories” without considering the realm of imagination and ideas a retreat from real-world target systems, one would, of course, expect our economics journals to be filled with articles supporting the stories with empirical evidence. However, I would argue that the journals still show a striking and embarrassing paucity of empirical studies that (try to) substantiate these theoretical claims. Equally amazing is how little one has to say about the relationship between the model and real-world target systems. It is as though explicit discussion, argumentation and justification on the subject were not thought to be required. Mainstream economic theory is obviously navigating in dire straits.

If the ultimate criterion of success for a deductivist system is the extent to which it predicts and coheres with (parts of) reality, modern mainstream economics seems to be a hopeless misallocation of scientific resources. To focus scientific endeavours on proving things in models is a gross misapprehension of what an economic theory ought to be about. Deductivist models and methods disconnected from reality are not relevant for predicting, explaining or understanding real-world economic target systems. These systems do not conform to the restricted closed-system structure that the mainstream modelling strategy presupposes.

Mainstream economic theory still today consists mainly in investigating economic models. It has long since given up on the real world and contents itself with proving things about thought-up worlds. Empirical evidence still only plays a minor role in mainstream economic theory, where models largely function as substitutes for empirical evidence.

What is wrong with mainstream economics is not that it employs models per se, but that it employs poor models. They are poor because they do not bridge to the real-world target systems in which we live. Hopefully humbled by the manifest failure of its theoretical pretences, the one-sided, almost religious, insistence on mathematical deductivist modelling as the only scientific activity worthy of pursuing in economics will give way to methodological pluralism based on ontological considerations rather than formalistic tractability.
