Simone de Beauvoir — a pedophilia supporter?

26 Jan, 2020 at 17:08 | Posted in Politics & Society | Leave a comment

It has to be said that Beauvoir’s interest in these matters was not purely theoretical … She was dismissed from her teaching job in 1943 for “behavior leading to the corruption of a minor.” The minor in question was one of her pupils at a Paris lycée. It is well established that she and Jean-Paul Sartre developed a pattern, which they called the “trio,” in which Beauvoir would seduce her students and then pass them on to Sartre …

Beauvoir’s “Lolita Syndrome” … offers an evangelical defence of the sexual emancipation of the young … Beauvoir posits Bardot as the incarnation of “authenticity” and natural, pure “desire,” with “aggressive” sexuality devoid of any hypocrisy. The author of “The Second Sex” is keen to stress sexual equality and autonomy, but she also insists on the “charms of the ‘nymph’ in whom the fearsome image of the wife and the mother is not yet visible.”

Andy Martin/New York Times

The pretence-of-knowledge syndrome

26 Jan, 2020 at 14:05 | Posted in Economics | 2 Comments

The reaction of human beings to the truly unknown is fundamentally different from the way they deal with the risks associated with a known situation and environment … In realistic, real-time settings, both economic agents and researchers have a very limited understanding of the mechanisms at work … In trying to add a degree of complexity to the current core models, by bringing in aspects of the periphery, we are simultaneously making the rationality assumptions behind that core approach less plausible …

The challenges are big, but macroeconomists can no longer continue playing internal games … I suspect that whatever the solution ultimately is, we will accelerate our convergence to it, and reduce the damage we do along the transition, if we focus on reducing the extent of our pretense-of-knowledge syndrome.

Ricardo J. Caballero

Caballero’s article underlines — especially when it comes to forecasting and implementing economic policies — that the future is inherently unknowable, and that using statistics, econometrics, decision theory or game theory does not in the least overcome this ontological fact.

Uncertainty is something that has to be addressed and not only assumed away. To overcome the feeling of hopelessness when confronting ‘unknown unknowns’, it is important — in economics in particular — to incorporate Keynes’s far-reaching and incisive analysis of induction and evidential weight in A Treatise on Probability (1921).

According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but “rational expectations.” Keynes rather thinks that we base our expectations on the confidence or “weight” we put on different events and alternatives. To Keynes, expectations are a question of weighing probabilities by “degrees of belief,” beliefs that often have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents as modelled by “modern” social sciences. And often we “simply do not know.”

How strange that social scientists and mainstream economists, as a rule, do not even touch upon these aspects of scientific methodology, which seem so fundamental and important for anyone trying to understand how we learn and orient ourselves in an uncertain world. An educated guess as to why this is so is that Keynes’s concepts cannot be squeezed into a single calculable numerical “probability.” In the quest for measurable quantities, one turns a blind eye to qualities and looks the other way.

So why do economists, companies and governments continue with the expensive, but obviously worthless, activity of trying to forecast/predict the future?

Some time ago yours truly was interviewed by a public radio journalist working on a series on Great Economic Thinkers. We were discussing the monumental failures of the predictions-and-forecasts business. But — the journalist asked — if these cocksure economists with their “rigorous” and “precise” mathematical-statistical-econometric models are so wrong again and again, why do they persist in wasting time on it?

In a discussion on uncertainty and the hopelessness of accurately modelling what will happen in the real world — in M. Szenberg’s Eminent Economists: Their Life Philosophies — Nobel laureate Kenneth Arrow comes up with what is probably the most plausible reason:

It is my view that most individuals underestimate the uncertainty of the world. This is almost as true of economists and other specialists as it is of the lay public. To me our knowledge of the way things work, in society or in nature, comes trailing clouds of vagueness … Experience during World War II as a weather forecaster added the news that the natural world was also unpredictable. An incident illustrates both uncertainty and the unwillingness to entertain it. Some of my colleagues had the responsibility of preparing long-range weather forecasts, i.e., for the following month. The statisticians among us subjected these forecasts to verification and found they differed in no way from chance. The forecasters themselves were convinced and requested that the forecasts be discontinued. The reply read approximately like this: ‘The Commanding General is well aware that the forecasts are no good. However, he needs them for planning purposes.’
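Arrow’s anecdote is, at bottom, about a simple statistical check: do the forecasts beat chance? A minimal, purely illustrative sketch of such a verification — invented data, and assuming SciPy’s binomtest is available — might look like this:

```python
# Illustrative only: hypothetical binary long-range forecasts compared against
# a pure-chance baseline. All numbers here are invented.
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(0)

n_months = 120
outcomes = rng.integers(0, 2, size=n_months)    # 1 = e.g. "wetter than normal"
forecasts = rng.integers(0, 2, size=n_months)   # forecasts with no real skill

hits = int((forecasts == outcomes).sum())

# Under the null hypothesis of pure chance the expected hit rate is 0.5;
# test whether the observed hit rate is distinguishable from that.
result = binomtest(hits, n_months, p=0.5, alternative="greater")
print(f"hit rate = {hits / n_months:.2f}, p-value against chance = {result.pvalue:.3f}")
```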

Uncertainty in economics

25 Jan, 2020 at 16:42 | Posted in Economics | 3 Comments

Not accounting for uncertainty may result in severe confusion about what we do indeed understand about the economy. In the financial crisis of 2007/2008 the demon has lashed out at this ignorance and challenged the credibility of the whole economic community by laying bare economists’ incapability to prevent the crisis …

Economics itself cannot be regarded as a purely analytical science. It has the amazing and exciting property of shaping the object of its own analysis. This feature clearly distinguishes it from physics, chemistry, archaeology and many other sciences. While biologists, chemists, engineers, physicists and many more are very able to transform whole societies by their discoveries and inventions — like penicillin or the internet — the laws of nature they study remain unaffected by these inventions. In economics, this constancy of the object under study just does not exist.

The financial crisis of 2007–2008 took most laymen and economists by surprise. What was it that went wrong with our macroeconomic models, since they obviously did not foresee the collapse or even make it conceivable?

There are many who have ventured to answer that question. And they have come up with a variety of answers, ranging from the exaggerated mathematization of economics to irrational and corrupt politicians.

But the root of our problem goes much deeper. It ultimately goes back to how we look upon the data we are handling. In ‘modern’ macroeconomics — Dynamic Stochastic General Equilibrium, New Synthesis, New Classical and New ‘Keynesian’ — variables are treated as if drawn from a known “data-generating process” that unfolds over time and for which we therefore have access to heaps of historical time-series. If we do not assume that we know the ‘data-generating process’ – if we do not have the ‘true’ model – the whole edifice collapses. And of course it has to. I mean, who honestly believes that we should have access to this mythical Holy Grail, the data-generating process?

‘Modern’ macroeconomics obviously did not anticipate the enormity of the problems that unregulated ‘efficient’ financial markets created. Why? Because it builds on the myth of us knowing the ‘data-generating process’ and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.
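To see what is at stake, here is a minimal, purely illustrative sketch (not anyone’s actual model): data are drawn from a fat-tailed process, but the analyst — assuming a known, well-behaved data-generating process — fits a normal distribution with matching mean and variance. The fitted model then grossly understates the probability of large losses.

```python
# Illustrative only: the 'true' process and the analyst's model are both invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# 'True' data-generating process: Student-t with 3 degrees of freedom (fat tails).
returns = stats.t.rvs(df=3, size=100_000, random_state=rng)

# Analyst's model: a normal distribution with matching mean and standard deviation,
# i.e. the data-generating process is assumed known up to two estimated parameters.
mu, sigma = returns.mean(), returns.std()

threshold = mu - 6 * sigma   # a very large loss, in units of the fitted std. dev.
p_model = stats.norm.cdf(threshold, loc=mu, scale=sigma)   # what the model says
p_empirical = (returns < threshold).mean()                 # what the data say

print(f"model-implied tail probability: {p_model:.1e}")
print(f"empirical tail frequency:       {p_empirical:.1e}")
```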

This is like saying that you are going on a holiday trip and that you know that the chance of the weather being sunny is at least 30%, and that this is enough for you to decide whether or not to bring your sunglasses. You are supposed to be able to calculate the expected utility based on the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.

But as Keynes convincingly argued in his monumental Treatise on Probability (1921), this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10% and according to another – equally good – model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to.
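A back-of-the-envelope illustration of the difference (all utilities and probabilities below are invented for the purpose): with one known probability, expected utility delivers a unique ‘rational’ choice; with two equally good models giving different probabilities, the ranking of the actions flips and no unique choice follows.

```python
# Illustrative only: hypothetical utilities for the sunglasses decision.
utility = {
    ("bring", "sunny"): 4, ("bring", "cloudy"): -2,   # carrying them has a small cost
    ("leave", "sunny"): -1, ("leave", "cloudy"): 0,
}

def expected_utility(action, p_sunny):
    return (p_sunny * utility[(action, "sunny")]
            + (1 - p_sunny) * utility[(action, "cloudy")])

# Risk: a single known probability of sun lets us rank the actions.
p = 0.30
print("p = 0.30:", {a: round(expected_utility(a, p), 2) for a in ("bring", "leave")})

# Uncertainty: two equally good models give different probabilities, and the
# best action is not the same under both -- no unique 'rational' choice follows.
for p in (0.10, 0.40):
    best = max(("bring", "leave"), key=lambda a: expected_utility(a, p))
    print(f"p = {p:.2f}: best action = {best}")
```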

In the end this is what it all boils down to. We all know that many activities, relations, processes and events are of the Keynesian uncertainty-type. The data do not unequivocally single out one decision as the only ‘rational’ one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

Some macroeconomists, however, still want to be able to use their hammer. So they pretend that the world looks like a nail and that uncertainty can be reduced to risk, and they construct their mathematical models on that assumption. The result: financial crises and economic havoc.

How much better — how much greater the chance that we do not lull ourselves into the comforting belief that we know everything, that everything is measurable and that we have everything under control — if instead we could simply admit that we often do not know, and that we have to live with that uncertainty as best we can.

Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheel is a sure recipe for only one thing — economic disaster.

En kyss med smak av oljeblandad bensin och fett

25 Jan, 2020 at 12:45 | Posted in Varia | Leave a comment

 

Hovern’ engan

24 Jan, 2020 at 17:46 | Posted in Varia | Leave a comment

 

Model and reality in economics

24 Jan, 2020 at 17:21 | Posted in Economics | Leave a comment

Economics is more model-oriented than any other social science. There are many reasons for this — the history of the discipline, ideals imported from the natural sciences, claims to universality, the ambition to explain as much as possible with as little as possible, rigour, precision, and so on.

The approach is fundamentally analytical — the whole is broken down into its constituent parts so that it becomes possible to explain (reduce) the aggregate (macro) as the result of interaction between the parts (micro).
Mainstream economists as a rule base their models on a set of core assumptions (CA) — which basically describe agents as ‘rational’ — together with a set of auxiliary assumptions (AA). Together, (CA) and (AA) make up what we could call the ‘base model’ (M) of all mainstream models. On the basis of these two sets of assumptions, one tries to explain and predict both individual (micro) and societal phenomena (macro).

The core assumptions typically consist of:
CA1 Completeness – the rational agent is always able to compare different alternatives and decide which one she prefers
CA2 Transitivity – if the agent prefers A to B, and B to C, she must prefer A to C
CA3 Non-satiation — more is always better than less
CA4 Maximization of expected utility – in situations characterized by risk, the agent always maximizes expected utility
CA5 Consistent economic equilibria – the actions of different agents are consistent, and the interaction between them results in an equilibrium

When agents are described as rational in these models, what is meant is instrumental rationality: agents are assumed to choose the alternatives that have the best consequences given their preferences. How these given preferences have arisen is as a rule regarded as lying outside the ‘scope’ of the concept of rationality, and therefore as not forming part of economic theory as such.

The picture the core assumptions (‘rational choice’) give us is of a rational agent with strong cognitive capacities, who knows what she wants, carefully weighs her alternatives and, given her preferences, chooses what she believes has the best consequences for her. Weighing the different alternatives against each other, the agent makes a consistent, rational choice and acts on it.
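As a purely illustrative sketch (the alternatives, states, probabilities and utilities are all made up), here is what two of the core assumptions amount to in practice — checking transitivity of a given preference ordering (CA2) and picking the act with the highest expected utility under a known risk distribution (CA4):

```python
# Illustrative only: a toy 'rational agent' satisfying CA2 and CA4.
from itertools import permutations

# Hypothetical preference ordering over three alternatives (higher = more preferred).
preference = {"A": 3, "B": 2, "C": 1}

def prefers(x, y):
    return preference[x] > preference[y]

# CA2 (transitivity): if x is preferred to y and y to z, x must be preferred to z.
transitive = all(prefers(x, z)
                 for x, y, z in permutations(preference, 3)
                 if prefers(x, y) and prefers(y, z))
print("preferences transitive:", transitive)

# CA4 (expected-utility maximization): under known risk, choose the act with
# the highest expected utility. States, probabilities and utilities are invented.
p_state = {"boom": 0.6, "bust": 0.4}
utility = {"stocks": {"boom": 10, "bust": -4},
           "bonds":  {"boom": 2,  "bust": 1}}

def expected_utility(act):
    return sum(p_state[s] * utility[act][s] for s in p_state)

best = max(utility, key=expected_utility)
print("chosen act:", best, {a: round(expected_utility(a), 2) for a in utility})
```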

The auxiliary assumptions (AA) specify, in space and time, what kind of interaction can take place between ‘rational’ agents. The assumptions typically answer questions such as:
AA1 who the agents are and where and when they interact
AA2 what their goals and aspirations are
AA3 what interests they have
AA4 what their expectations are
AA5 what kind of scope for action they have
AA6 what kinds of agreements they can enter into
AA7 how much and what kind of information they possess
AA8 how their actions interact with one another

So the ‘base model’ of all mainstream models consists of a general characterization of what (axiomatically) constitutes optimizing rational agents (CA), together with a more specific description (AA) of the kinds of situations in which these agents act (which means that AA functions as a restriction determining the intended domain of application of CA and the theorems deductively derived from them). The list of assumptions can never be complete, since there are always also unspecified ‘background assumptions’ and unmentioned omissions (such as transaction costs, closure conditions and the like, often based on some kind of negligibility or applicability considerations). The hope is that this ‘thin’ set of assumptions will be sufficient to explain and predict ‘thick’ phenomena in the real, complex world.


The intellectual regress of macroeconomics

21 Jan, 2020 at 17:45 | Posted in Economics | 1 Comment

Real business cycle theory — RBC — is one of the theories that have put macroeconomics on a path of intellectual regress for three decades now. And although there are many kinds of useless ‘post-real’ economics held in high regard within the mainstream economics establishment today, few — if any — are less deserving of that regard than real business cycle theory.

The future is not reducible to a known set of prospects. It is not like sitting at the roulette table and calculating what the future outcomes of spinning the wheel will be. So instead of — as RBC economists do — assuming calibration and rational expectations to be right, one ought to confront the hypothesis with the available evidence. It is not enough to construct models. Anyone can construct models. To be seriously interesting, models have to come with an aim. They have to have an intended use. If the intention of calibration and rational expectations is to help us explain real economies, it has to be evaluated from that perspective. A model or hypothesis without specific applicability is not really deserving of our interest.

Without strong evidence, all kinds of absurd claims and nonsense may pretend to be science. We have to demand more of a justification than rather watered-down versions of ‘anything goes’ when it comes to rationality postulates. If one proposes rational expectations, one also has to support its underlying assumptions. No such support is given by RBC economists, which makes it rather puzzling how rational expectations has become the standard modelling assumption in much of modern macroeconomics. Perhaps the reason is that economists often mistake mathematical beauty for truth.

In the hands of Lucas, Prescott and Sargent, rational expectations have been transformed from an — in-principle — testable hypothesis to an irrefutable proposition. Believing in a set of irrefutable propositions may be comfortable – like religious convictions or ideological dogmas – but it is not science.

So where does this all lead us? What is the trouble ahead for economics? Putting a sticky-price DSGE lipstick on the RBC pig sure won’t do. Neither will — as Paul Romer noticed — just looking the other way and pretending it’s raining:

The trouble is not so much that macroeconomists say things that are inconsistent with the facts. The real trouble is that other economists do not care that the macroeconomists do not care about the facts. An indifferent tolerance of obvious error is even more corrosive to science than committed advocacy of error.

The documentary that cleanses my soul (personal)

21 Jan, 2020 at 17:22 | Posted in Varia | Leave a comment

In every modern person’s life there is a need for time to catch one’s breath and reflect. And sometimes — when all the possible and impossible musts and demands from the world around us simply become too many and too loud — it can be good to withdraw a little and slow down for a while.

We all have our own ways of doing that. Myself, I usually go to Öppet Arkiv and watch Gubben i stugan — Nina Hedenius’s wonderful documentary about the retired forestry worker Ragnar and his life in the Finn forests of Dalarna.

Simple. Beautiful. Balm for the soul.

On causality and econometrics

20 Jan, 2020 at 11:45 | Posted in Statistics & Econometrics | Leave a comment

The point is that a superficial analysis, which only looks at the numbers, without attempting to assess the underlying causal structures, cannot lead to a satisfactory data analysis … We must go out into the real world and look at the structural details of how events occur … The idea that the numbers by themselves can provide us with causal information is false. It is also false that a meaningful analysis of data can be done without taking any stand on the real-world causal mechanism … These issues are of extreme importance with reference to Big Data and Machine Learning. Machines cannot expend shoe leather, and enormous amounts of data cannot provide us with knowledge of the causal mechanisms in a mechanical way. However, a small amount of knowledge of real-world structures used as causal input can lead to substantial payoffs in terms of meaningful data analysis. The problem with current econometric techniques is that they do not have any scope for input of causal information – the language of econometrics does not have the vocabulary required to talk about causal concepts.

Asad Zaman / WEA Pedagogy

What Asad Zaman tells us in his splendid set of lectures is that causality in social sciences can never be solely a question of statistical inference. Causality entails more than predictability, and really explaining social phenomena in depth requires theory. Analysis of variation — the foundation of all econometrics — can never in itself reveal how these variations are brought about. Only when we are able to tie actions, processes or structures to the statistical relations detected can we say that we are getting at relevant explanations of causation.

Most facts have many different, possible, alternative explanations, but we want to find the best of all contrastive explanations (since all real explanation takes place relative to a set of alternatives). So which is the best explanation? Many scientists, influenced by statistical reasoning, think that the likeliest explanation is the best explanation. But the likelihood of x is not in itself a strong argument for thinking it explains y. I would rather argue that what makes one explanation better than another are things like aiming for and finding powerful, deep, causal features and mechanisms that we have warranted and justified reasons to believe in. Statistical reasoning — especially the variety based on a Bayesian epistemology — generally has no room for these kinds of explanatory considerations. The only thing that matters is the probabilistic relation between evidence and hypothesis. That is also one of the main reasons I find abduction — inference to the best explanation — a better description and account of what constitutes actual scientific reasoning and inference.

Some statisticians and data scientists think that algorithmic formalisms somehow give them access to causality. That is, however, simply not true. Assuming ‘convenient’ things like faithfulness or stability is not to give proofs. It’s to assume what has to be proven. Deductive-axiomatic methods used in statistics do not produce evidence for causal inferences. The real causality we are searching for is the one existing in the real world around us. If there is no warranted connection between axiomatically derived theorems and the real world, well, then we haven’t really obtained the causation we are looking for.
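A minimal simulation (invented numbers, nothing more) of the point that the numbers by themselves cannot reveal causal structure: a direct-causation mechanism and a pure-confounding mechanism generate practically identical correlations between X and Y, even though intervening on X would change Y only under the first.

```python
# Illustrative only: two different causal mechanisms, one observational footprint.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Mechanism 1: X causes Y directly.
x1 = rng.normal(size=n)
y1 = 0.7 * x1 + rng.normal(scale=np.sqrt(1 - 0.7 ** 2), size=n)

# Mechanism 2: a hidden confounder Z drives both X and Y; X has no effect on Y.
z = rng.normal(size=n)
x2 = np.sqrt(0.7) * z + rng.normal(scale=np.sqrt(0.3), size=n)
y2 = np.sqrt(0.7) * z + rng.normal(scale=np.sqrt(0.3), size=n)

print("correlation under direct causation:", round(np.corrcoef(x1, y1)[0, 1], 3))
print("correlation under pure confounding:", round(np.corrcoef(x2, y2)[0, 1], 3))
```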

Alternatives to mainstream economics

20 Jan, 2020 at 09:03 | Posted in Economics | 3 Comments

 

Levon Minassian

19 Jan, 2020 at 17:58 | Posted in Varia | 3 Comments

 

Politics and the market — a seminar

19 Jan, 2020 at 17:46 | Posted in Varia | Leave a comment

The need for radical social analysis and critique of ideology has seldom been greater than now, a good fifty years after the journal Häften för kritiska studier first saw the light of day.

Taking a newly published anthology by eleven researchers as its starting point, the seminar discusses, in the spirit of the journal, current trends in the capitalism of our time. Is US imperialism down for the count? Does the digital platform economy amount to a fourth industrial revolution? Why can traditional economics not explain the transformation of the economy? How should the neoliberalization of European social democracy be analysed? And how has feminism developed?

These and other questions will be taken up at the seminar, where some of the anthology’s authors take part.

Lars Ekdahl gives an overview of the book and a little of its background, followed by a number of chapter presentations. Yvonne Hirdman speaks on the paths and detours of feminism, Stefan de Vylder on the weakened global role of the US under Trump, Anders Gullberg on the new platform capitalism, and Lars Pålsson Syll on the methodological collapse of economics.

The seminar takes place on 22 January, 18.00–19.30, in Beskowsalen at ABF, Sveav. 84, Stockholm. Free admission.

Dialogos Förlag

The Left Party’s betrayal of immigrant women

19 Jan, 2020 at 17:22 | Posted in Politics & Society | 1 Comment

Amine Kakabaveh is a Kurd with roots in Iran … Last summer she was expelled from the Left Party after several years of conflict with the party leadership. As a member of parliament she pushed hard on questions of honour-based oppression – something the left for a long time did not dare to speak openly about.
– They kept telling me that I was fuelling racism when I talked about girls being beaten and married off. But it is those who keep silent who fuel racism. It is because they have been so cowardly that SD has grown …

The fear of the truth – that many immigrant women live under religious and cultural patriarchal oppression – is still widespread in the party.
– Jonas betrayed immigrant women. He was a coward.

Amine Kakabaveh is the front figure of the association “Varken Hora eller Kuvad.” According to the association, honour-based oppression has increased since the murder of Fadime. Thousands of women and girls live with morality police, both inside and outside the family, who decide how they are to dress and behave.
– The problem is that Swedish society wants us to keep our own culture. Sweden is the only country in all of Europe that pays hundreds of millions to various ethnic and religious associations without even making any demands.
– Society should not help sustain fundamentalism, honour-based oppression and patriarchal structures with taxpayers’ money.

Olle Lönnaeus/Sydsvenskan

Experiments in social sciences

18 Jan, 2020 at 22:30 | Posted in Statistics & Econometrics | 1 Comment

How, then, can social scientists best make inferences about causal effects? One option is true experimentation … Random assignment ensures that any differences in outcomes between the groups are due either to chance error or to the causal effect … If the experiment were to be repeated over and over, the groups would not differ, on average, in the values of potential confounders. Thus, the average of the average difference of group outcomes, across these many experiments, would equal the true difference in outcomes … The key point is that randomization is powerful because it obviates confounding …

Thad Dunning’s book is a very useful guide for social scientists interested in research methodology in general and natural experiments in particular. Dunning argues that since random or as-if random assignment in natural experiments obviates the need for controlling potential confounders, this kind of “simple and transparent” design-based research method is preferable to more traditional multivariate regression analysis, where the controlling only comes in ex post via statistical modelling.

But — there is always a but …

The point of conducting a randomized experiment is often said to be that it ‘ensures’ that any correlation between a supposed cause and effect indicates a causal relation. This is believed to hold since randomization (allegedly) ensures that a supposed causal variable does not correlate with other variables that may influence the effect.

The problem with that simplistic view of randomization is that the claims made are exaggerated and sometimes even false.

Since most real-world experiments and trials build on a finite number of randomizations, what would happen if you kept on randomizing forever does not help you ‘ensure’ or ‘guarantee’ that you avoid false causal conclusions in the one particular randomized experiment you actually perform. It is indeed difficult to see why thinking about what you know you will never do should make you happy about what you actually do.
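A small simulation (purely illustrative) of that point: in the one finite randomization you actually perform, a pre-treatment confounder can end up noticeably imbalanced between groups, even though it balances on average over the many hypothetical re-randomizations you will never carry out.

```python
# Illustrative only: chance imbalance of a confounder in a single small trial.
import numpy as np

rng = np.random.default_rng(7)
n = 40                                   # a small trial
confounder = rng.normal(size=n)          # some pre-treatment characteristic

def group_difference():
    """Mean confounder difference (treated minus control) for one random assignment."""
    treated = rng.permutation(n) < n // 2
    return confounder[treated].mean() - confounder[~treated].mean()

one_run = group_difference()                                       # the trial actually run
average_over_reruns = np.mean([group_difference() for _ in range(10_000)])

print(f"confounder imbalance in the single trial actually run: {one_run:+.2f}")
print(f"average imbalance over 10,000 hypothetical re-runs:    {average_over_reruns:+.4f}")
```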

In econometrics one often gets the feeling that many of its practitioners think of it as a kind of automatic inferential machine: input data and out comes causal knowledge. This is like pulling a rabbit from a hat. Great — but first you have to put the rabbit in the hat. And this is where assumptions come into the picture.

The assumption of imaginary ‘super populations’ is one of the many dubious assumptions used in modern econometrics.

As social scientists — and economists — we have to confront the all-important question of how to handle uncertainty and randomness. Should we define randomness with probability? If we do, we have to accept that to speak of randomness we also have to presuppose the existence of nomological probability machines, since probabilities cannot be spoken of — and actually, to be strict, do not at all exist — without specifying such system-contexts. Accepting a domain of probability theory and sample space of infinite populations also implies that judgments are made on the basis of observations that are actually never made!

Infinitely repeated trials or samplings never take place in the real world. So that cannot be a sound inductive basis for science with aspirations of explaining real-world socio-economic processes, structures or events. It’s not tenable. Why should we as social scientists — and not as pure mathematicians working with formal-axiomatic systems without the urge to confront our models with real target systems — unquestioningly accept models based on concepts like the ‘infinite super populations’ used in e.g. the ‘potential outcome’ framework that has become so popular lately in social sciences?

One could, of course, treat observational or experimental data as random samples from real populations. I have no problem with that (although it has to be noted that most ‘natural experiments’ are not based on random sampling from some underlying population — which, of course, means that the effect estimators, strictly speaking, are unbiased only for the specific groups studied). But probabilistic econometrics does not content itself with that kind of population. Instead, it creates imaginary populations of ‘parallel universes’ and assumes that our data are random samples from that kind of ‘infinite super population.’

In social sciences — including economics — it’s always wise to ponder C. S. Peirce’s remark that universes are not as common as peanuts …

Quinn Slobodian and the birth of neoliberalism

18 Jan, 2020 at 13:55 | Posted in Economics, Politics & Society | 1 Comment

 

It is a measure of the success of this fascinating, innovative history that it forces the question: after Slobodian’s reinterpretation, where does the critique of neoliberalism stand?

First and foremost, Slobodian has underlined the profound conservatism of the first generation of neoliberals and their fundamental hostility to democracy. What he has exposed, furthermore, is their deep commitment to empire as a restraint on the nation state. Notably, in the case of Wilhelm Röpke, this was reinforced by deep-seated anti-black racism. Throughout the 1960s Röpke was active on behalf of South Africa and Rhodesia in defense of what he saw as the last bastions of white civilization in the developing world. As late as the 1980s, members of the Mont Pèlerin Society argued that the white minority in South Africa could best be defended by weighting the voting system by the proportion of taxes paid. If this was liberalism it was not so much neo- as paleo-.

Adam Tooze
