La théorie du ruissellement (the trickle-down theory)

20 Nov, 2021 at 13:35 | Posted in Economics | Leave a comment


Study proves trickle-down didn't trickle | PoliticsNC


20 Nov, 2021 at 11:15 | Posted in Varia | Leave a comment


What killed macroeconomics?

19 Nov, 2021 at 16:44 | Posted in Economics | 17 Comments

The COVID-19 pandemic impelled governments to fall back on “fiscal Keynesianism,” because there was no way that just increasing the quantity of money could lead to the reopening of businesses that were prevented by law from doing so. Fiscal Keynesianism in the big lockdown meant issuing Treasury payments to people prevented from working.

But now that the economy has reopened, the practical rationale for monetary and fiscal expansion has disappeared. Mainstream financial commentators believe the economy will bounce back as if nothing had happened. After all, economies fall into foxholes no more often than individuals normally do. So, the time has come to tighten both monetary and fiscal policy, because continued expansion of either or both will lead only to a “surge in inflation.” We can all breathe a sigh of relief; the trauma is over, and normal life without unemployment will resume.

Monetary policy works in theory but not in practice; fiscal policy works in practice but not in theory. Fiscal Keynesianism is still a policy in search of a theory. Acemoglu, Laibson, and List supply a piece of the missing theory when they note that shocks are “hard to predict.” Keynes would have said they are impossible to predict, which is why he rejected the standard view that economies are cyclically stable in the absence of shocks (which is as useless as saying that leaves don’t flutter in the absence of wind).

The supply and demand models that first-year economics students are taught can illuminate the equilibrium path of the hairdressing industry but not of the economy as a whole. Macroeconomics is the child of uncertainty. Unless economists recognize the existence of inescapable uncertainty, there can be no macroeconomic theory, only prudential responses to emergencies.

Robert Skidelsky

Modern macroeconomics — Dynamic Stochastic General Equilibrium, New Synthesis, New Classical and New ‘Keynesian’ — still follows an ‘as if’ logic of denying the existence of genuine uncertainty and treats variables as if they were drawn from a known ‘data-generating process’ with a known probability distribution that unfolds over time, and on which we therefore have access to heaps of historical time-series data. If we do not assume that we know the ‘data-generating process’ — if we do not have the ‘true’ model — the whole edifice collapses. And of course, it has to. Who really, honestly believes that we have access to this mythical Holy Grail, the data-generating process?

Modern macroeconomics obviously did not anticipate the enormity of the problems that unregulated ‘efficient’ financial markets created. Why? Because it builds on the myth that we know the ‘data-generating process’ and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

This is like saying that you are going on a holiday trip and that you know that the chance of the weather being sunny is at least 30%, and that this is enough for you to decide whether or not to bring along your sunglasses. You are supposed to be able to calculate the expected utility based on the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.

But as Keynes convincingly argued in his monumental Treatise on Probability (1921), this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10% and according to another — equally good — model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to.
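
This distinction can be made concrete with a toy calculation (all numbers purely illustrative, not from any actual model): under risk, a single known probability yields one expected-utility ranking and hence one ‘rational’ choice; under uncertainty, two equally good models yield opposite rankings, and the expected-utility machinery gives no unique answer.

```python
# Toy illustration (all payoffs hypothetical): deciding whether to pack
# sunglasses, with utility 5 if you pack them and it is sunny, -1 if you
# pack them needlessly, and 0 if you leave them at home.
def expected_utility_of_packing(p_sunny):
    return p_sunny * 5 + (1 - p_sunny) * (-1)

# Risk: one known probability gives a unique 'rational' decision.
print(expected_utility_of_packing(0.30) > 0)   # True: pack them

# Uncertainty: two equally good models disagree, and there is no
# probability distribution over the models themselves to fall back on.
print(expected_utility_of_packing(0.10) > 0)   # False: leave them at home
print(expected_utility_of_packing(0.40) > 0)   # True: pack them
```

With the two rival models the decision flips, which is exactly the situation in which no means or variances can be calculated and no single probability number can be defended.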

In the end, this is what it all boils down to. We all know that many activities, relations, processes and events are of the Keynesian uncertainty-type. The data do not unequivocally single out one decision as the only ‘rational’ one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

Some macroeconomists, however, still want to be able to use their hammer. So they decide to pretend that the world looks like a nail, and pretend that uncertainty can be reduced to risk. So they construct their mathematical models on that assumption. The result: financial crises and economic havoc.

How much better — how much smaller the risk that we lull ourselves into the comforting thought that we know everything, that everything is measurable, and that we have everything under control — if we could instead simply admit that we often just do not know, and that we have to live with that uncertainty as best we can.

Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheel is a sure recipe for only one thing — economic catastrophe.

Dangerous fools in tinfoil hats

19 Nov, 2021 at 15:49 | Posted in Politics & Society | Leave a comment


In the midst of the pandemic we find ourselves in today, this is — the world over — without doubt the most dangerous group of people trudging around on our planet. If we are to stop the corona epidemic, politicians and decision-makers must stop coddling these ticking pandemic bombs. If you have not been vaccinated, you have no business being in restaurants, bars, shops, football arenas, pop concerts, schools, or workplaces. Full stop!

My moral compass (personal)

18 Nov, 2021 at 19:49 | Posted in Varia | 1 Comment


This is the advice I’ve always given my sons and daughters when they’ve come and asked for guidance when confronted with momentous moral issues in life:

Ask yourself — Can I do this and still like what I see in the mirror tomorrow when I wake up?

That moral compass has served me well for almost half a century now, and I’m sure it has also helped my youngsters to become the good and honest people they are.

More than economists

18 Nov, 2021 at 16:17 | Posted in Economics | 2 Comments

Veblen, Keynes, and Hirschman were more than economists because they practiced their economics from a standpoint outside the profession, using it to criticize not only the assumption of rational self-interest, but also the consequences of economists’ indifference to “preferences.” Veblen’s standpoint was explicitly religious; he was still of a believing generation. Keynes, too, was an ethicist. G.E. Moore’s Principia Ethica remained what he called his “religion under the surface.” Hirschman wanted a “moral social science” that would be continually sensitive to the ethical content of its analysis …

These three economists’ frequently mocking style was their way of establishing their distance from their profession. Their irony was not ornamental but actually shaped the substance of their arguments. This style limited their impact on economics, but made them highly influential outside it, because critics of economics sensed something transgressive about them.

Systematic thinkers close a subject, leaving their followers with “normal” science to fill up the learned journals. Fertile ones open up their disciplines to critical scrutiny, for which they rarely get credit.

Robert Skidelsky

The experimentalist ‘revolution’ in economics

18 Nov, 2021 at 10:34 | Posted in Statistics & Econometrics | Leave a comment

What has always bothered me about the “experimentalist” school is the false sense of certainty it conveys. The basic idea is that if we have a “really good instrument” we can come up with “convincing” estimates of “causal effects” that are not “too sensitive to assumptions.” Elsewhere I have written an extensive critique of this experimentalist perspective, arguing it presents a false panacea, and that all statistical inference relies on some untestable assumptions …

Consider Angrist and Lavy (1999), who estimate the effect of class size on student performance by exploiting variation induced by legal limits. It works like this: Let’s say a law prevents class size from exceeding 30. Let’s further assume a particular school has student cohorts that average about 90, but that cohort size fluctuates between, say, 84 and 96. So, if cohort size is 91–96 we end up with four classrooms of size 22 to 24, while if cohort size is 85–90 we end up with three classrooms of size 28 to 30. By comparing test outcomes between students who are randomly assigned to the small vs. large classes (based on their exogenous birth timing), we obtain a credible estimate of the effect of class size on academic performance. Their answer is that a ten-student reduction raises scores by about 0.2 to 0.3 standard deviations.

This example shares a common characteristic of natural experiment studies, which I think accounts for much of their popularity: At first blush, the results do seem incredibly persuasive. But if you think for a while, you start to see that they rest on a host of assumptions. For example, what if schools that perform well attract more students? In this case, incoming cohort sizes are not random, and the whole logic breaks down. What if parents who care most about education respond to large class sizes by sending their kids to a different school? What if teachers assigned to the extra classes offered in high-enrollment years are not a random sample of all teachers?

Michael Keane
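
The assignment rule in Keane’s stylized example can be sketched in a few lines (a cap of 30 fits his numbers; the actual Angrist-Lavy setting used a cap of 40):

```python
import math

# Stylized Maimonides'-rule assignment: a cohort is split into the
# smallest number of classes such that no class exceeds the legal cap.
CAP = 30

def class_size(cohort):
    n_classes = math.ceil(cohort / CAP)
    return cohort / n_classes

# Cohorts of 85-90 yield three classes of roughly 28-30 students...
print([round(class_size(c), 1) for c in range(85, 91)])
# ...while cohorts of 91-96 yield four classes of roughly 23-24.
print([round(class_size(c), 1) for c in range(91, 97)])
```

The discontinuity at 90/91 students is the “instrument”: a one-student change in cohort size moves average class size by six to seven students, which is what makes the comparison look experiment-like — and, as Keane stresses, what makes the implicit assumptions about cohort sizes being random so critical.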

On wokeness and Islamism

17 Nov, 2021 at 17:08 | Posted in Politics & Society | Leave a comment


A thousand kisses deep

16 Nov, 2021 at 21:02 | Posted in Varia | Leave a comment


Et maintenant

16 Nov, 2021 at 20:47 | Posted in Varia | Leave a comment


Rethinking economics

15 Nov, 2021 at 17:53 | Posted in Economics | 7 Comments

The incorporation of new information makes sense only if the future is to be similar to the past. Any kind of empirical test, whatever form it adopts, will not make sense, however, if the world is uncertain because in such a world induction does not work. Past experience is not a useful guide to guess the future in these conditions (it only serves when the future, somehow, is already implicit in the present) … I believe the only way to use past experience is to assume that the world is repetitive. In a non-repetitive world in which relevant novelties unexpectedly arise testing is irrelevant …

Conceiving economic processes like sequences of events in which uncertainty reigns, where consequently there are “no laws”, nor “invariants” or “mechanisms” to discover, the kind of learning that experiments or past experience provide is of no use for the future, because it eliminates innovation and creativity and does not take into account the arboreal character and the open-ended nature of the economic process … However, as said before, we can gather precise information, restricted in space and time (data). But, what is the purpose of obtaining this sort of information if uncertainty about future events prevails? … The problem is that taking uncertainty seriously puts in question the relevance the data obtained by means of testing or experimentation has for future situations.

Marqués’ book is a serious challenge to much of mainstream economic thinking and its methodological and philosophical underpinnings. A must-read for anyone interested in the foundations of economic theory, showing how far-reaching the effects of taking Keynes’ concept of genuine uncertainty seriously really are.

Science according to Keynes should help us penetrate to “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts.” Models can never be more than a starting point in that endeavour. He further argued that it was inadmissible to project history on the future. Consequently, we cannot presuppose that what has worked before will continue to do so in the future. That statistical models can get hold of correlations between different ‘variables’ is not enough. If they cannot help us get at the causal structure that generated the data, they are not really ‘identified.’

How strange, then, that economics textbooks do not even touch upon these aspects of scientific methodology, which seem to be so fundamental and important for anyone trying to understand how we learn and orient ourselves in an uncertain world! An educated guess on why this is so would be that Keynes’ concepts are not possible to squeeze into a single calculable numerical ‘probability.’ In the quest for quantities one turns a blind eye to qualities, looks the other way, and hopes people will forget about Keynes’ fundamental insight.

Robert Lucas once wrote — in Studies in Business-Cycle Theory — that “in cases of uncertainty, economic reasoning will be of no value.” Now, if that were true, it would put us in a tough dilemma. If we have to consider — as Lucas did — uncertainty incompatible with economics being a science, and we actually know for sure that there are several and deeply important situations in real-world contexts where we — both epistemologically and ontologically — face genuine uncertainty, well, then we actually would have to choose between reality and science.

That can’t be right. We all know we do not know very much about the future. We all know the future harbours lots of unknown unknowns. Those are ontological facts we just have to accept — and still go for both reality and science, in developing a realist and relevant economic science.

Kathleen Stock — transgender and the limits of academic freedom

15 Nov, 2021 at 17:34 | Posted in Politics & Society | Leave a comment


On tour

10 Nov, 2021 at 16:49 | Posted in Varia | Leave a comment


Guest appearance in Hamburg.

Regular blogging to be resumed next week.

The presumed advantage of the experimentalist approach

9 Nov, 2021 at 15:56 | Posted in Statistics & Econometrics | Leave a comment

Here, I want to challenge the popular view that “natural experiments” offer a simple, robust and relatively “assumption free” way to learn interesting things about economic relationships. Indeed, I will argue that it is not possible to learn anything of interest from data without theoretical assumptions, even when one has available an “ideal instrument”. Data cannot determine interesting economic relationships without a priori identifying assumptions, regardless of what sort of idealized experiments, “natural experiments” or “quasi-experiments” are present in that data. Economic models are always needed to provide a window through which we interpret data, and our interpretation will always be subjective, in the sense that it is contingent on our model.

Furthermore, atheoretical “experimentalist” approaches do not rely on fewer or weaker assumptions than do structural approaches. The real distinction is that, in a structural approach, one’s a priori assumptions about behavior must be laid out explicitly, while in an experimentalist approach, key assumptions are left implicit …

If one accepts that inferences drawn from experimentalist work are just as contingent on a priori assumptions as those from structural work, the key presumed advantage of the experimentalist approach disappears. One is forced to accept that all empirical work in economics, whether “experimentalist” or “structural”, relies critically on a priori theoretical assumptions.

Michael Keane

In econometrics, it is often said that the error term in the regression model used represents the effect of the variables that are omitted from the model. The error term is somehow thought to be a ‘cover-all’ term representing omitted content in the model and necessary to include to ‘save’ the assumed deterministic relation between the other random variables included in the model. Error terms are usually assumed to be orthogonal (uncorrelated) to the explanatory variables. But since they are unobservable, they are also impossible to empirically test. And without justification of the orthogonality assumption, there is as a rule nothing to ensure identifiability. To me, this only highlights that the important lesson to draw from the debate between ‘structuralist’ and ‘experimentalist’ econometricians is that no matter what set of assumptions you choose to build your analysis on, you will never be able to empirically test them conclusively. Ultimately it always comes down to a question of faith.
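
The untestability point can be illustrated with a small simulation (a sketch with made-up numbers, not any published model): even when the true disturbance is correlated with the regressor, the fitted OLS residuals are orthogonal to the regressor by construction, so no residual-based check can reveal that the orthogonality assumption fails.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# True model: y = 1.0 * x + u, but the disturbance u is correlated with x
# because an omitted variable z drives both, violating orthogonality.
z = rng.normal(size=n)           # omitted common cause
x = z + rng.normal(size=n)
u = z + rng.normal(size=n)
y = 1.0 * x + u

# OLS slope of y on x (with intercept): biased toward ~1.5, not the true 1.0,
# since cov(x, u) = var(z) = 1 and var(x) = 2.
beta = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
print(round(float(beta), 2))

# The fitted residuals are nevertheless orthogonal to x by construction,
# so the violated assumption leaves no trace in the data we can inspect.
alpha = y.mean() - beta * x.mean()
residuals = y - alpha - beta * x
print(round(float(np.corrcoef(x, residuals)[0, 1]), 6))  # effectively zero
```

The estimate is far from the true coefficient, yet the residual diagnostics look perfect — which is precisely why the choice between ‘structuralist’ and ‘experimentalist’ assumptions cannot be settled by the data alone.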

Making sense of economics

9 Nov, 2021 at 10:31 | Posted in Economics | Leave a comment

Robert Lucas, one of the most creative model-builders, tells a story about his undergraduate encounter with Gregor Mendel’s model of genetic inheritance. He liked the Mendelian model—“you could work out predictions that would surprise you”—though not the lab work breeding fruit flies to test it. (Economists are not big on mucking around in the real world.) Over the weekend, he enjoyed writing a paper comparing the model’s predictions with the class’s experimental results. When a friend returned from a weekend away without having written the required paper, Lucas agreed to let the friend borrow from his. The friend remarked that Lucas had forgotten to discuss how “crossing-over” could explain the substantial discrepancies between the model and experimental results. “Crossing-over is b—s—,” Lucas told his friend, a “label for our ignorance.” He kept his paper’s focus on the unadorned Mendelian model, and added a section arguing that experimental errors could explain the discrepancies. His friend instead appended a section on crossing-over. His friend got an A. Lucas got a C-minus, with a comment: “This is a good report, but you forgot about crossing-over.” Crossing-over is actually a fact; it occurs when a portion of one parent gene is incorporated in the other parent gene. But Lucas’s anecdote brilliantly illustrates the powerful temptation to model-builders—across the ideological spectrum—of ignoring inconvenient facts that don’t fit their models.

Economics may be an informative tool for research. But if its practitioners do not investigate and make an effort to provide a justification for the credibility of the assumptions on which they erect their building, it will not fulfil its task. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics will continue to consider its ultimate arguments a mixture of rather unhelpful metaphors and metaphysics.

The marginal return on its ever higher technical sophistication in no way makes up for the lack of serious under-labouring of its deeper philosophical and methodological foundations.

A rigorous application of economic methods really presupposes that the phenomena of our real-world economies are ruled by stable causal relations. Unfortunately, real-world social systems are usually not governed by stable causal relations and mechanisms. The kinds of ‘laws’ and relations that economics has established are laws and relations about entities in models that usually presuppose causal mechanisms to be invariant, atomistic and additive. But when causal mechanisms operate in the real world, they do so only in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so as a rule only because we engineered them for that purpose. Outside man-made ‘nomological machines’ they are rare, or even non-existent.
