Facebook’s digital currency Libra — just for suckers or a global economic revolution?

14 Sep, 2019 at 19:04 | Posted in Economics | 1 Comment

Together with 20 partners, Facebook has announced the digital world currency “Libra” … At a stroke, the consortium can exploit every conceivable network effect in digital payments. Libra will thus be immediately present everywhere. A global economic revolution.

The digital economy follows the rule “the winner takes it all”. Whoever has the network effects on their side creates a quasi-monopoly. Facebook, moreover, is a master of user-friendly interfaces. Several of the Libra partners already command the mass data processing this requires. There is ample experience with digital currencies such as Bitcoin, and with the use of blockchains and messenger services to transfer money. Experience, technological competence, a worldwide user base, and lobbying and capital power thus make the Libra consortium exceptionally strong.

Sarah Spiekermann

Interesting view. But obviously not all economists are convinced that Libra would be a blessing:

Every currency is based on confidence that the hard-earned dollars “deposited” into it will be redeemable on demand. The private banking sector has long shown that it is untrustworthy in this respect, which is why new prudential regulations have been necessary.

But, in just a few short years, Facebook has earned a level of distrust that took the banking sector much longer to achieve. Time and again, Facebook’s leaders, faced with a choice between money and honoring their promises, have grabbed the money. And nothing could be more about money than creating a new currency. Only a fool would trust Facebook with his or her financial wellbeing. But maybe that’s the point: with so much personal data on some 2.4 billion monthly active users, who knows better than Facebook just how many suckers are born every minute?

Joseph Stiglitz

Do models make economics a science?

14 Sep, 2019 at 14:41 | Posted in Economics | 3 Comments

Well, if we are to believe most mainstream economists, models are what make economics a science.

In a Journal of Economic Literature review of Dani Rodrik’s Economics Rules, renowned game theorist Ariel Rubinstein discusses Rodrik’s justifications for the view that “models make economics a science.” Although Rubinstein has some doubts about those justifications — models are not indispensable for telling good stories or clarifying things in general; logical consistency does not determine whether economic models are right or wrong; and being able to expand our set of ‘plausible explanations’ doesn’t make economics more of a science than good fiction does — he still largely subscribes to the scientific image of economics as a result of using formal models that help us achieve ‘clarity and consistency’.

There’s much in the review I like — Rubinstein shows a commendable scepticism about the prevailing excessive mathematization of economics, and he is much more in favour of a pluralist teaching of economics than most other mainstream economists — but on the core question, “the model is the message,” I beg to differ with the view put forward by both Rodrik and Rubinstein.

Economics is, more than any other social science, model-oriented. There are many reasons for this — the history of the discipline, ideals imported from the natural sciences (especially physics), the search for universality (explaining as much as possible with as little as possible), rigour, precision, etc.

Mainstream economists want to explain social phenomena, structures and patterns, based on the assumption that the agents are acting in an optimizing (rational) way to satisfy given, stable and well-defined goals.

The procedure is analytical. The whole is broken down into its constituent parts so as to be able to explain (reduce) the aggregate (macro) as the result of interaction of its parts (micro).

Modern mainstream (neoclassical) economists ground their models on a set of core assumptions (CA) — basically describing the agents as ‘rational’ actors — and a set of auxiliary assumptions (AA). Together CA and AA make up what might be called the ‘ur-model’ (M) of all mainstream neoclassical economic models. Based on these two sets of assumptions, they try to explain and predict both individual (micro) and — most importantly — social phenomena (macro).

The core assumptions typically consist of:

CA1 Completeness — the rational actor is able to compare different alternatives and decide which one(s) he prefers.

CA2 Transitivity — if the actor prefers A to B, and B to C, he must also prefer A to C.

CA3 Non-satiation — more is preferred to less.

CA4 Maximizing expected utility — in choice situations under risk (calculable uncertainty) the actor maximizes expected utility.

CA5 Consistent efficiency equilibria — the actions of different individuals are consistent, and the interaction between them results in an equilibrium.

When the actors in these models are described as rational, the concept of rationality used is instrumental rationality — consistently choosing the preferred alternative, the one judged to have the best consequences for the actor given his wishes/interests/goals, which are exogenously given in the model. How these preferences/wishes/interests/goals are formed is typically not considered to be within the realm of rationality, and a fortiori not part of economics proper.

The picture given by this set of core assumptions (rational choice) is of an agent with strong cognitive capacity who knows what alternatives he faces, evaluates them carefully, calculates the consequences, and — given his preferences — chooses the alternative he believes has the best consequences.

Weighing the different alternatives against each other, the actor makes a consistent optimizing choice (typically described as maximizing some kind of utility function) and acts accordingly.
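To make this concrete, here is a minimal Python sketch of such an optimizing agent, choosing between a risky lottery and a sure payoff. Everything in it (the square-root utility function, the two alternatives, all names) is an illustrative assumption, not a piece of any particular mainstream model:

```python
# A toy 'rational actor' in the CA1-CA5 sense: complete, transitive
# preferences represented by a utility function, and choice by
# maximizing expected utility over risky alternatives (CA4).

def expected_utility(alternative, utility):
    """Expected utility of one risky alternative (CA4)."""
    return sum(p * utility(x)
               for x, p in zip(alternative["outcomes"], alternative["probs"]))

def choose(alternatives, utility):
    """CA1 (completeness) and CA2 (transitivity) hold automatically,
    since real-valued expected utilities can always be ranked."""
    return max(alternatives, key=lambda a: expected_utility(a, utility))

# CA3 (non-satiation): utility is increasing in the outcome.
# The square root makes the agent risk-averse.
u = lambda x: x ** 0.5

lottery = {"name": "A", "outcomes": [100, 0], "probs": [0.5, 0.5]}  # EU = 5
sure_thing = {"name": "B", "outcomes": [40], "probs": [1.0]}        # EU ~ 6.32

print(choose([lottery, sure_thing], u)["name"])  # -> B
```

A risk-averse utility function is enough to make the agent prefer the certain 40 over a fair gamble for 100; nothing in the sketch says where those preferences come from, which is exactly the point made above.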

Besides the core assumptions (CA), the model also typically has a set of auxiliary assumptions (AA) spatio-temporally specifying the kind of social interaction between the ‘rational actors’ that takes place in the model. These assumptions can be seen as giving answers to questions such as

Continue Reading Do models make economics a science?…

Capital et idéologie — Thomas Piketty’s new book

13 Sep, 2019 at 16:58 | Posted in Economics | Leave a comment

What do we find in this imposing new 1,200-page opus?

We must go back to what made the previous book a success: the empirical account of the dynamics of inequality over a long period. The treatment here is both more historical – going back to the eighteenth century – and broader, covering many European countries, in particular France and the United Kingdom, but also the United States, with long passages on India and China and excursions to Brazil, Russia, Iran and many other countries. In short, a less Western-centred approach than its predecessor, which dealt essentially with France and the United States.

The French Revolution did not change much about the concentration of wealth … The real revolution took place in the course of the twentieth century, with the emergence of a property-owning middle class: the richest 10% lost ground to the benefit of the 40% just below them. A good part of the book is devoted to explaining the reasons for this historical dynamic …

In Capital et Idéologie, the researcher has not forgotten the questions he was asking in the 1990s: the most original part of the book offers a socio-electoral analysis of voting by level of education, income and wealth. He shows that the social-democratic parties in France, the United Kingdom, the United States and other countries, however different they may be, have all undergone the same evolution: whereas from the 1950s to the 1980s they gathered the votes of the least qualified and the poorest, they have become the party of the most highly educated.

Abandoning the least favoured to their fate, they embraced the ‘proprietarian’ ideology celebrating property rights, leaning on its emancipatory dimension – everyone has the right to own something and to enjoy the state’s protection in keeping it – while forgetting its inegalitarian side, the richest accumulating without limit. Several chapters show that this is the return of an idea developed in the course of the nineteenth century.

Christian Chavagneux

Legends never die

12 Sep, 2019 at 16:38 | Posted in Varia | Leave a comment

 

The lack of positive results in econometrics

12 Sep, 2019 at 11:33 | Posted in Statistics & Econometrics | Leave a comment

For the sake of balancing the overly rosy picture of econometric achievements given in the usual econometrics textbooks today, it may be interesting to see how Trygve Haavelmo — with the completion (in 1958) of the twenty-fifth volume of Econometrica — assessed the role of econometrics in the advancement of economics. 

We have found certain general principles which would seem to make good sense. Essentially, these principles are based on the reasonable idea that, if an economic model is in fact “correct” or “true,” we can say something a priori about the way in which the data emerging from it must behave. We can say something, a priori, about whether it is theoretically possible to estimate the parameters involved. And we can decide, a priori, what the proper estimation procedure should be … But the concrete results of these efforts have often been a seemingly lower degree of accuracy of the would-be economic laws (i.e., larger residuals), or coefficients that seem a priori less reasonable than those obtained by using cruder or clearly inconsistent methods.

There is the possibility that the more stringent methods we have been striving to develop have actually opened our eyes to recognize a plain fact: viz., that the “laws” of economics are not very accurate in the sense of a close fit, and that we have been living in a dream-world of large but somewhat superficial or spurious correlations.

Since statisticians and econometricians have not been able to convincingly warrant their assumptions — homogeneity, stability, invariance, independence, additivity, and so on — as being ontologically isomorphic to real-world economic systems, there are still strong reasons to be critical of the econometric project. There are deep epistemological and ontological problems of applying statistical methods to a basically unpredictable, uncertain, complex, unstable, interdependent, and ever-changing social reality. Methods designed to analyse repeated sampling in controlled experiments under fixed conditions are not easily extended to an organic and non-atomistic world where time and history play decisive roles.

Econometric modelling should never be a substitute for thinking.

The general line you take is interesting and useful. It is, of course, not exactly comparable with mine. I was raising the logical difficulties. You say in effect that, if one was to take these seriously, one would give up the ghost in the first lap, but that the method, used judiciously as an aid to more theoretical enquiries and as a means of suggesting possibilities and probabilities rather than anything else, taken with enough grains of salt and applied with superlative common sense, won’t do much harm. I should quite agree with that. That is how the method ought to be used.

Keynes, letter to E.J. Broster, December 19, 1939

Necessary inventions …

10 Sep, 2019 at 09:19 | Posted in Varia | 3 Comments

[Image: pointless inventions]

The quasi-peaceable gentleman of leisure, then, not only consumes of the staff of life beyond the minimum required for subsistence and physical efficiency, but his consumption also undergoes a specialisation as regards the quality of the goods consumed. He consumes freely and of the best, in food, drink, narcotics, shelter, services, ornaments, apparel, weapons and accoutrements, amusements, amulets, and idols or divinities.

Thorstein Veblen

It’s not just p = 0.048 vs. p = 0.052

9 Sep, 2019 at 17:37 | Posted in Statistics & Econometrics | Leave a comment

“[G]iven the realities of real-world research, it seems goofy to say that a result with, say, only a 4.8% probability of happening by chance is “significant,” while if the result had a 5.2% probability of happening by chance it is “not significant.” Uncertainty is a continuum, not a black-and-white difference” …

My problem with the 0.048 vs. 0.052 thing is that it way, way, way understates the problem.

Yes, there’s no stable difference between p = 0.048 and p = 0.052.

But there’s also no stable difference between p = 0.2 (which is considered non-statistically significant by just about everyone) and p = 0.005 (which is typically considered very strong evidence) …

If these two p-values come from two identical experiments, then the standard error of their difference is sqrt(2) times the standard error of each individual estimate, hence that difference in p-values itself is only (2.81 – 1.28)/sqrt(2) = 1.1 standard errors away from zero …

So. Yes, it seems goofy to draw a bright line between p = 0.048 and p = 0.052. But it’s also goofy to draw a bright line between p = 0.2 and p = 0.005. There’s a lot less information in these p-values than people seem to think.

So, when we say that the difference between “significant” and “not significant” is not itself statistically significant, “we are not merely making the commonplace observation that any particular threshold is arbitrary—for example, only a small change is required to move an estimate from a 5.1% significance level to 4.9%, thus moving it into statistical significance. Rather, we are pointing out that even large changes in significance levels can correspond to small, nonsignificant changes in the underlying quantities.”

Andrew Gelman
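Gelman’s arithmetic is easy to check. A minimal Python sketch, assuming scipy is available (the function names are mine):

```python
# Reproducing Gelman's arithmetic: translate two-sided p-values into
# z-scores and ask how far apart the two 'results' really are.
from scipy.stats import norm

def z_from_p(p):
    """z-score corresponding to a two-sided p-value."""
    return norm.ppf(1 - p / 2)

z_strong = z_from_p(0.005)  # ~ 2.81, 'very strong evidence'
z_weak = z_from_p(0.2)      # ~ 1.28, 'not significant'

# For two independent, identical experiments the standard error of the
# difference is sqrt(2) times the standard error of each estimate.
print((z_strong - z_weak) / 2 ** 0.5)  # ~ 1.08 standard errors from zero
```

The gap between ‘very strong evidence’ and ‘nothing’ is itself only about one standard error — nowhere near significant by the very convention the p-values are supposed to serve.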

Companies are swimming in an ocean of debt

9 Sep, 2019 at 15:41 | Posted in Economics | 1 Comment

On Wednesday 4 September, the world’s richest company, Apple, sitting on its hoard of 200 billion dollars (180 billion euros), judged that it was time to borrow a little money. It issued seven billion dollars’ worth of bonds – debt securities – even though it plainly does not know what to do with its cash.

In total, American companies issued nearly 74 billion dollars’ worth of bonds in that first week of September 2019 alone. A historic record that spread to every class of debt and across the whole planet, with an unprecedented worldwide total of 150 billion dollars.

On the buyers’ side, the reason for this enthusiasm is the same one that has prevailed ever since the central banks drastically lowered the yields on government bonds. With this risk-free investment no longer returning anything – indeed costing money – investors have shifted to the bonds of large corporations, which pay a little more than US Treasury bonds, with little risk. Unlike a share, a bond provides a fixed return that is known from the outset …

The stock of corporate debt in the United States has doubled since 2008, and 50% of it now sits in companies rated BBB by the agencies, that is, only one notch above ‘junk bonds’, those high-risk bonds that are sensitive to downturns in the business cycle. A considerable bubble has formed, and the needle that will burst it is drawing closer.

Philippe Escande / Le Monde

Central bank independence — institutionalizing monetary handcuffs

8 Sep, 2019 at 12:23 | Posted in Economics | 2 Comments

Imposing a hard target can bind the central bank, but the government must then act on failures to hit the target. Why would it if it is self-interested? If it does, that amounts to saying it is not selfish, which undermines the argument that independence is needed. The same argument can be used to deconstruct independence itself. Suppose independence is a solution to time inconsistency. Why would a selfish politician ever agree to independence in the first place? If they did, that would be tantamount to saying they are not selfish, in which case independence is not needed. In other words, only non-self-interested politicians choose independence, making independence redundant …

Even if the banker is honest, there still remains the fundamental question of why selfish politicians would go against their own interests and appoint a conservative independent central banker. Doing so is tantamount to proving they are not selfish, in which case there is no need for an independent central bank.

That microeconomic contradiction suggests something else is going on with the shift to central bank independence. By definition, selfish politicians cannot be authorizing it out of public interest. Instead, they are doing so out of self-interest, which is the clue to understanding the real reasons for the shift to central bank independence … That implies central bank independence is not the socially benevolent phenomenon mainstream economists and central bankers claim it to be. Instead, somewhat obviously, it is a highly political development serving partisan interests …

The real issues are why do independent banks go after inflation harder, and what is the role of independence?

The reason they go after inflation harder is they are aligned with capital. That is because capital is politically in charge and sets the goals for central banks. It is also because central bankers and their economic advisers have bought into the Chicago School monetary policy framework which implicitly sides with capital (i.e. views the problem as being inflation prone government). That explains why there is central bank independence …

Democratic countries may still decide to implement central bank independence, but that decision is a political one with non-neutral economic and political consequences. It is a grave misrepresentation to claim independence solves a fundamental public interest economic problem, and economists make themselves accomplices by claiming it does.

Thomas Palley

Kitchen sink econometrics

8 Sep, 2019 at 11:45 | Posted in Statistics & Econometrics | 1 Comment

When I present this argument … one or more scholars say, “But shouldn’t I control for everything I can in my regressions? If not, aren’t my coefficients biased due to excluded variables?” This argument is not as persuasive as it may seem initially. First of all, if what you are doing is misspecified already, then adding or excluding other variables has no tendency to make things consistently better or worse … The excluded variable argument only works if you are sure your specification is precisely correct with all variables included. But no one can know that with more than a handful of explanatory variables.
Still more importantly, big, mushy linear regression and probit equations seem to need a great many control variables precisely because they are jamming together all sorts of observations that do not belong together. Countries, wars, racial categories, religious preferences, education levels, and other variables that change people’s coefficients are “controlled” with dummy variables that are completely inadequate to modeling their effects. The result is a long list of independent variables, a jumbled bag of nearly unrelated observations, and often a hopelessly bad specification with meaningless (but statistically significant with several asterisks!) results.

A preferable approach is to separate the observations into meaningful subsets—internally compatible statistical regimes … If this can’t be done, then statistical analysis can’t be done. A researcher claiming that nothing else but the big, messy regression is possible because, after all, some results have to be produced, is like a jury that says, “Well, the evidence was weak, but somebody had to be convicted.”

Christopher H. Achen
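Achen’s point can be seen in a few lines of simulated Python. The sketch below (all data and names are invented for the illustration) pools two internally coherent ‘regimes’ with opposite slopes and ‘controls’ for regime membership with a dummy; the pooled coefficient then describes nobody:

```python
# Simulated illustration: two 'regimes' with opposite slopes, pooled
# into one regression with a regime dummy. The dummy only shifts the
# intercept, so the pooled slope is precise-looking nonsense.
import numpy as np

rng = np.random.default_rng(42)
n = 1000
x = rng.normal(size=n)
regime = rng.integers(0, 2, size=n)          # two incompatible subsets
y = np.where(regime == 0, 2.0, -2.0) * x + rng.normal(0, 0.5, size=n)

# Pooled OLS of y on [1, x, regime_dummy]
X = np.column_stack([np.ones(n), x, regime])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print("pooled slope:", beta[1])              # ~ 0: describes nobody

# Achen's alternative: internally compatible subsets, estimated separately
for r in (0, 1):
    m = regime == r
    Xr = np.column_stack([np.ones(m.sum()), x[m]])
    print("regime", r, "slope:", np.linalg.lstsq(Xr, y[m], rcond=None)[0][1])
```

The separate regressions recover slopes near +2 and −2; the pooled regression, dummy and all, reports something near zero.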

The empirical and theoretical evidence is clear. Predictions and forecasts are inherently difficult to make in a socio-economic domain where genuine uncertainty and unknown unknowns often rule the roost. The real processes that underlie the time series that economists use to make their predictions and forecasts do not conform with the assumptions made in the applied statistical and econometric models. A fortiori, much less is predictable than is standardly — and uncritically — assumed. The forecasting models fail to a large extent because the kind of uncertainty that faces humans and societies makes the models, strictly speaking, inapplicable. The future is inherently unknowable — and using statistics, econometrics, decision theory or game theory does not in the least overcome this ontological fact. The economic future is not something that we normally can predict in advance. Better, then, to accept that as a rule ‘we simply do not know.’

We could, of course, just assume that the world is ergodic and hence convince ourselves that we can predict the future by looking at the past. Unfortunately, economic systems do not display that property. So we simply have to accept that all our forecasts are fragile.
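To see rather than take on faith what non-ergodicity means here, consider a textbook multiplicative gamble; the parameters below are illustrative assumptions, not an empirical claim. The ensemble average grows every period, yet the typical individual path shrinks, so extrapolating any one path’s past says little about the ‘expected’ future:

```python
# A multiplicative gamble: each period wealth is multiplied by 1.5 or
# 0.6 with equal probability. The ensemble average grows (factor 1.05
# per period), yet the typical single path shrinks (growth factor
# sqrt(1.5 * 0.6) ~ 0.95 per period): the process is non-ergodic.
import numpy as np

rng = np.random.default_rng(7)
paths, steps = 10_000, 100
wealth = rng.choice([1.5, 0.6], size=(paths, steps)).prod(axis=1)

print("theoretical ensemble mean:", 1.05 ** steps)    # ~ 131.5
print("median path:", np.median(wealth))              # ~ 0.005
print("share of losing paths:", (wealth < 1).mean())  # ~ 0.87
```

The ensemble average and the time average of one and the same process point in opposite directions — which is precisely why assuming ergodicity to rescue forecasting is wishful thinking.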

Despite that, I live without despair

8 Sep, 2019 at 09:52 | Posted in Varia | Leave a comment

For the world I belong to I hold no hope. Its downfall is built into the system, for every cry of warning is promptly ridiculed and thereby rendered harmless.

Despite that, I live without despair. It is part of the individual’s condition to live at the edge of catastrophe. The road to death appears equally long wherever in life one happens to stand. So it is, too, with the existence of mankind.

I notice the melancholy gaze with which young people look at me. I recognize it. They do not know that I regard them in the same way, measuring their remaining lives under the same sign of transience. We all belong together.

Olof Lagercrantz

The pretense-of-knowledge syndrome in economics

8 Sep, 2019 at 08:15 | Posted in Economics | 1 Comment

What does concern me about my discipline … is that its current core — by which I mainly mean the so-called dynamic stochastic general equilibrium approach — has become so mesmerized with its own internal logic that it has begun to confuse the precision it has achieved about its own world with the precision that it has about the real one …

While it often makes sense to assume rational expectations for a limited application to isolate a particular mechanism that is distinct from the role of expectations formation, this assumption no longer makes sense once we assemble the whole model. Agents could be fully rational with respect to their local environments and everyday activities, but they are most probably nearly clueless with respect to the statistics about which current macroeconomic models expect them to have full information and rational information-processing abilities.

This issue is not one that can be addressed by adding a parameter capturing a little bit more risk aversion about macroeconomic, rather than local, phenomena. The reaction of human beings to the truly unknown is fundamentally different from the way they deal with the risks associated with a known situation and environment … In realistic, real-time settings, both economic agents and researchers have a very limited understanding of the mechanisms at work. This is an order of magnitude less knowledge than our core macroeconomic models currently assume, and hence it is highly likely that the optimal approximation paradigm is quite different from current workhorses, both for academic and policy work. In trying to add a degree of complexity to the current core models, by bringing in aspects of the periphery, we are simultaneously making the rationality assumptions behind that core approach less plausible …

The challenges are big, but macroeconomists can no longer continue playing internal games. The alternative of leaving all the important stuff to the “policy”-types and informal commentators cannot be the right approach. I do not have the answer. But I suspect that whatever the solution ultimately is, we will accelerate our convergence to it, and reduce the damage we do along the transition, if we focus on reducing the extent of our pretense-of-knowledge syndrome.

Ricardo J. Caballero

A great article that also underlines — especially when it comes to forecasting and implementing economic policies — that the future is inherently unknowable, and that using statistics, econometrics, decision theory or game theory does not in the least overcome this ontological fact.

According to Keynes, we live in a world permeated by unmeasurable uncertainty — not quantifiable stochastic risk — which often forces us to make decisions based on anything but “rational expectations.” Keynes rather thinks that we base our expectations on the confidence or “weight” we put on different events and alternatives. To Keynes, expectations are a question of weighing probabilities by “degrees of belief,” beliefs that often have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents modelled by “modern” social sciences. And often we “simply do not know.”

So why do economists, companies and governments continue with the expensive, but obviously worthless, activity of trying to forecast/predict the future?

A couple of years ago yours truly was interviewed by a public radio journalist working on a series on Great Economic Thinkers. We were discussing the monumental failures of the predictions-and-forecasts business. But — the journalist asked — if these cocksure economists with their “rigorous” and “precise” mathematical-statistical-econometric models are so wrong again and again, why do they persist in wasting time on it?

In a discussion on uncertainty and the hopelessness of accurately modelling what will happen in the real world — in M. Szenberg’s Eminent Economists: Their Life Philosophies — Nobel laureate Kenneth Arrow comes up with what is probably the most plausible reason:

It is my view that most individuals underestimate the uncertainty of the world. This is almost as true of economists and other specialists as it is of the lay public. To me our knowledge of the way things work, in society or in nature, comes trailing clouds of vagueness … Experience during World War II as a weather forecaster added the news that the natural world was also unpredictable. An incident illustrates both uncertainty and the unwillingness to entertain it. Some of my colleagues had the responsibility of preparing long-range weather forecasts, i.e., for the following month. The statisticians among us subjected these forecasts to verification and found they differed in no way from chance. The forecasters themselves were convinced and requested that the forecasts be discontinued. The reply read approximately like this: ‘The Commanding General is well aware that the forecasts are no good. However, he needs them for planning purposes.’

Esther Duflo vs Elinor Ostrom

5 Sep, 2019 at 18:45 | Posted in Economics | Leave a comment

While both authors subscribe to realism, they practise two types of realism. The realism supported by Duflo is akin to a naive ‘metrological realism’ … in which quantification is seen as merely mirroring reality within a margin of error, whereas Ostrom seems closer to critical realism and constructivism: the way we perceive and quantify reality is moulded by our cognitive maps and conventions. The rationales of the social scientist and of the economic actors are also distinctive. Whereas Duflo underlines the objectivity and rightness of the scientist applying sound techniques – which contrasts with the lack of information and the restrained horizon of local actors – Ostrom emphasises the processual, bounded and interpretative rationality of both the researcher and the observed actors. This leads to diverging views and normative agendas regarding development, politics and economics.

Duflo sees development as the implementation and replication of expert-led fixes to provide basic goods for the poor, who are often blinded by their exacting situation. It is a technical quest for certainty and optimal measures in a fairly static framework. For the Ostroms, there are no best practices, only a few architectonic principles to build locally resilient orders. They view development as a situated learning process under uncertainty …

In Duflo’s science-based ‘benevolent paternalism’, the experimental technique works as an ‘anti-politics machine’ … social goals being predefined and RCT outcomes ideally settling ambiguities and conflicts. Real-world politics – disregarding or instrumentalising RCTs – and institutions – resulting from social compromises instead of evidence – are thus often perceived as external disturbances and constraints on economic science and evidence-based policy. This depoliticising stance is at odds with the significance of political economy for the Ostroms and their emphasis on deliberation to co-construct the aspirations and agencies of communities. While Duflo and Banerjee are in line with a technocratic democracy, the Ostroms sustain a Tocquevillean democratic self-governance. For the latter, institutions emanating from democratic processes, far from being straitjackets, are the core of economic processes. They simultaneously constrain and enable human action.

Agnès Labrousse

Inequality and the future of capitalism

5 Sep, 2019 at 17:23 | Posted in Economics | Leave a comment

 

Brexit fatigue

5 Sep, 2019 at 16:57 | Posted in Varia | Leave a comment

Jacob Rees-Mogg, Conservative MP and Leader of the House of Commons, slouching during the late-night debate on Brexit yesterday …
