Dievaines

18 June, 2017 at 19:06 | Posted in Varia

 

Absolutely fabulous!

Der Wind hat sich gedreht

18 June, 2017 at 16:21 | Posted in Varia

 

Back in 1980 yours truly had the pleasure of studying at the University of Vienna. When not studying or paying weekly visits to Berggasse 19, I used to listen to songs like this on my portable music player. To me it is as true today as it was to Degenhardt in 1980 that the only thing we seem to learn from history is that a lot of people don’t (want to) learn anything from it …

Nationalekonomi — ett annat slags vetenskap

18 June, 2017 at 14:33 | Posted in Economics

An economics that assumes its object is a pure natural phenomenon or merely a thought experiment is not a real economics, but another kind of science …

 
  

‘National-ekonomien i stöpsleven,’ 1936

 
 

Proving gender discrimination using randomization (student stuff)

17 June, 2017 at 10:06 | Posted in Statistics & Econometrics

 
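For students who want to try the idea in code, here is a minimal permutation-test sketch in Python. The numbers are hypothetical (24 ‘male’ and 24 ‘female’ job files with different promotion rates) and are not taken from the clip; the point is only the randomization logic: shuffle the gender labels many times and see how often a promotion gap at least as large as the observed one arises by pure chance.

```python
import numpy as np

# Hedged sketch: hypothetical promotion data, not the clip's own example.
# Null hypothesis: gender has no effect, so the labels are exchangeable.
rng = np.random.default_rng(42)

men = np.array([1] * 21 + [0] * 3)     # 24 'male' files, 21 promoted
women = np.array([1] * 14 + [0] * 10)  # 24 'female' files, 14 promoted
observed_gap = men.mean() - women.mean()

pooled = np.concatenate([men, women])
n_men, n_perm, extreme = len(men), 100_000, 0
for _ in range(n_perm):
    rng.shuffle(pooled)                               # randomly reassign labels
    gap = pooled[:n_men].mean() - pooled[n_men:].mean()
    if gap >= observed_gap:                           # gap at least as large as observed
        extreme += 1

print(f"observed gap: {observed_gap:.3f}")
print(f"one-sided permutation p-value: {extreme / n_perm:.4f}")
```

If the p-value is very small, a gap of the observed size is hard to attribute to chance alone, which is the core of the randomization argument for discrimination.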

Säsongsavslutning

17 June, 2017 at 08:54 | Posted in Economics

In a time when the soundscape is drowned in the opinion-stuffed verbal diarrhoea and utterly vacuous, puerile drivel of commercial radio, many of us have more or less given up. Radio, which was once a source of both refreshment and reflection, has degenerated into a postmodern idol of shallowness.

But there is light in the darkness!

In the programme Text och musik med Eric Schüldt — broadcast on Sunday mornings on P2 between 11 a.m. and noon — you can listen to serious music and a host who has something to say and does not just flap his jaw.

A balm for the soul.

In the season-finale programme, this beautiful piece by Tchaikovsky was among those played:

Ed Leamer and the pitfalls of econometrics

16 June, 2017 at 18:09 | Posted in Statistics & Econometrics

Ed Leamer’s Tantalus on the Road to Asymptopia is one of my favourite critiques of econometrics, and for the benefit of those who are not versed in the econometric jargon, this handy summary gives the gist of it in plain English:

[Image: plain-English summary of Leamer’s critique]
Most work in econometrics and regression analysis is — still — done on the assumption that the researcher has a theoretical model that is ‘true.’ Based on this belief of having a correct specification for an econometric model or running a regression, one proceeds as if the only remaining problems have to do with measurement and observation.

When things sound too good to be true, they usually aren’t. And that goes for econometric wet dreams too. The snag is, as Leamer convincingly argues, that there is precious little to support the perfect-specification assumption. Looking around in social science and economics we don’t find a single regression or econometric model that lives up to the standards set by the ‘true’ theoretical model — and there is precious little that gives us reason to believe things will be different in the future.

To think that we are able to construct a model in which all relevant variables are included and the functional relationships between them are correctly specified is not only a belief without support, but a belief impossible to support.

The theories we work with when building our econometric regression models are insufficient. No matter what we study, there are always some variables missing, and we don’t know the correct way to functionally specify the relationships between the variables.

Every regression model constructed is misspecified. There is always an endless list of possible variables to include, and endless possible ways to specify the relationships between them. So every applied econometrician comes up with his own specification and ‘parameter’ estimates. The econometric Holy Grail of consistent and stable parameter values is nothing but a dream.

In order to draw inferences from data as described by econometric texts, it is necessary to make whimsical assumptions. The professional audience consequently and properly withholds belief until an inference is shown to be adequately insensitive to the choice of assumptions. The haphazard way we individually and collectively study the fragility of inferences leaves most of us unconvinced that any inference is believable. If we are to make effective use of our scarce data resource, it is therefore important that we study fragility in a much more systematic way. If it turns out that almost all inferences from economic data are fragile, I suppose we shall have to revert to our old methods …

Ed Leamer
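To make the fragility point concrete, here is a small simulation sketch (my own illustration on simulated data, not Leamer's procedure): the ‘parameter of interest’ is re-estimated under every possible choice of control variables, and the spread of estimates shows how much the conclusion hinges on a whimsical specification choice.

```python
import itertools
import numpy as np

# Hedged sketch: simulated data with correlated candidate regressors.
# The coefficient on x1 is re-estimated for every subset of controls,
# crudely mimicking the specification searches Leamer worries about.
rng = np.random.default_rng(1)
n = 500
cov = np.array([[1.0, 0.6, 0.5, 0.4],
                [0.6, 1.0, 0.5, 0.3],
                [0.5, 0.5, 1.0, 0.4],
                [0.4, 0.3, 0.4, 1.0]])
X = rng.multivariate_normal(np.zeros(4), cov, size=n)
x1, controls = X[:, 0], X[:, 1:]

# The 'true' process -- known to us here, never to the applied econometrician.
y = 1.0 * x1 + 0.8 * controls[:, 0] - 0.5 * controls[:, 1] + rng.standard_normal(n)

estimates = []
for k in range(controls.shape[1] + 1):
    for subset in itertools.combinations(range(controls.shape[1]), k):
        Z = np.column_stack([np.ones(n), x1] + [controls[:, j] for j in subset])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        estimates.append(beta[1])                    # coefficient on x1

print(f"{len(estimates)} specifications, coefficient on x1 ranges from "
      f"{min(estimates):.2f} to {max(estimates):.2f} (true value: 1.00)")
```

An inference worth believing would have to be shown to be insensitive to this kind of specification choice, which is precisely Leamer's demand.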

A rigorous application of econometric methods in economics really presupposes that the phenomena of our real-world economies are ruled by stable causal relations between variables. Parameter values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption one, however, has to convincingly establish that the targeted acting causes are stable and invariant, so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.

The theoretical conditions that have to be fulfilled for regression analysis and econometrics to really work are nowhere close to being met in reality. Making outlandish statistical assumptions does not provide a solid ground for doing relevant social science and economics. Although regression analysis and econometrics have become the most used quantitative methods in social sciences and economics today, it’s still a fact that the inferences made from them are invalid.

Econometrics — and regression analysis — is basically a deductive method. Given the assumptions (such as manipulability, transitivity, separability, additivity, linearity, etc) it delivers deductive inferences. The problem, of course, is that we will never completely know when the assumptions are right. Conclusions can only be as certain as their premises — and that also applies to econometrics and regression analysis.
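A tiny sketch of that last point, using simulated data of my own (nothing from the post itself): the data-generating process is quadratic, but the regression assumes linearity, so the precisely estimated ‘marginal effect’ is an artefact of the premise rather than a fact about the data.

```python
import numpy as np

# Hedged sketch: the premise (linearity) is false, so the deduced conclusion
# (a single constant marginal effect) is only as good as that premise.
rng = np.random.default_rng(7)
x = rng.uniform(0, 10, size=1_000)
y = 2.0 * x**2 + rng.standard_normal(1_000)      # true relation is quadratic

X = np.column_stack([np.ones_like(x), x])        # assumed linear specification
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"estimated 'constant marginal effect': {beta[1]:.1f}")
print("true marginal effect 4x varies between 0 and 40 over the sample")
```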

What is it that DSGE models — really — explain?

16 June, 2017 at 16:55 | Posted in Economics

Now it is “dynamic stochastic general equilibrium” (DSGE) models inspired by the Lucas critique that have failed to predict or even explain the Great Recession of 2007–2009. More precisely, the implicit “explanations” based on these models are that the recession, including the millions of net jobs lost, was primarily due to large negative shocks to both technology and willingness to work … So can the reputation of modern macroeconomics be rehabilitated by simply modifying DSGE models to include a few more realistic shocks? …

A simple example helps illustrate for the uninitiated just how DSGE models work and why it should come as little surprise that they are largely inadequate for the task of explaining the Great Recession.

For this simple DSGE model, consider the following technical assumptions: i) an infinitely-lived representative agent with rational expectations and additive utility in current and discounted future log consumption and leisure; ii) a Cobb-Douglas aggregate production function with labor-augmenting technology; iii) capital accumulation with a fixed depreciation rate; and iv) a stochastic process for exogenous technology shocks …

It is worth making two basic points about the setup. First, by construction, technology shocks are the only underlying source of fluctuations in this simple model. Thus, if we were to assume that U.S. real GDP was the literal outcome of this model, we would be assuming a priori that fluctuations in real GDP were ultimately due to technology. When faced with the Great Recession, this model would have no choice but to imply that technology shocks were somehow to blame. Second, despite the underlying role of technology, the observed fluctuations in real GDP can be divided into those that directly reflect the behavior of the exogenous shocks and those that reflect the endogenous capital accumulation in response to these shocks.

To be more precise about these two points, it is necessary to assume a particular process for the exogenous technology shocks. In this case, let’s assume technology follows a random walk with drift [and assuming a 100% depreciation rate of capital]…

So, with this simple DSGE model and for typical measures of the capital share, we have the implication that output growth follows an AR(1) process with an AR coefficient of about one third. This is notable given that such a time-series model does reasonably well as a parsimonious description of quarterly real GDP dynamics for the U.S. economy …

However, the rather absurd assumption of a 100% depreciation rate at the quarterly horizon would surely still have prompted a sharp question or two in a University of Chicago seminar back in the days. So, with this in mind, what happens if we consider the more general case?

Unfortunately, for more realistic depreciation rates, we cannot solve the model analytically. Instead, taking a log-linearization around steady state, we can use standard methods to solve for output growth … This simple DSGE model is able to mimic the apparent AR(1) dynamics in real GDP growth. But it does so by assuming the exogenous technology shocks also follow an AR(1) process with an AR coefficient that happens to be the same as the estimated AR coefficient for output growth. Thus, the magic trick has been revealed: a rabbit was stuffed into the hat and then a rabbit jumped out of the hat …

Despite their increasing sophistication, DSGE models share one key thing in common with their RBC predecessors. After more than two decades of earnest promises to do better in the “future directions” sections of academic papers, they still have those serially-correlated shocks. Thus, the models now “explain” variables like real GDP, inflation, and interest rates as the outcome of more than just serially-correlated technology shocks. They also consider serially-correlated preference shocks and serially-correlated policy shocks …

James Morley
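Morley’s ‘rabbit in the hat’ can be checked in a few lines. The sketch below is my own toy version of the special case he describes (log utility, Cobb-Douglas production, a random-walk technology, and the admittedly absurd 100% depreciation rate): in that case the saving rate is constant, and output growth mechanically follows an AR(1) process with an autoregressive coefficient equal to the capital share.

```python
import numpy as np

# Hedged sketch of the textbook special case: Y_t = A_t * K_t^alpha, log utility,
# full depreciation, so K_{t+1} = alpha*beta*Y_t and
# growth_t = mu + eps_t + alpha * growth_{t-1}   (AR(1) with coefficient alpha).
rng = np.random.default_rng(0)
alpha, beta = 1 / 3, 0.99            # capital share, discount factor
mu, sigma, T = 0.004, 0.01, 200_000  # technology drift, shock std, sample size

log_A = np.cumsum(mu + sigma * rng.standard_normal(T))   # random walk with drift
log_Y = np.zeros(T)
for t in range(1, T):
    # log Y_t = log A_t + alpha * log K_t, with K_t = alpha*beta*Y_{t-1}
    log_Y[t] = log_A[t] + alpha * (np.log(alpha * beta) + log_Y[t - 1])

g = np.diff(log_Y[100:])                 # output growth, dropping a short burn-in
ar1 = np.polyfit(g[:-1], g[1:], 1)[0]    # OLS slope of g_t on g_{t-1}
print(f"estimated AR(1) coefficient of output growth: {ar1:.3f} "
      f"(capital share = {alpha:.3f})")
```

With realistic depreciation the analytical shortcut disappears, and, as Morley notes, the AR(1)-like dynamics then have to be smuggled in through serially correlated shocks instead.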

And still mainstream economists seem to be impressed by the ‘rigour’ brought to macroeconomics by New-Classical-New-Keynesian DSGE models, with their rational expectations and microfoundations!

It is difficult to see why.

Take the rational expectations assumption. Rational expectations in the mainstream economists’ world implies that relevant distributions have to be time independent. This amounts to assuming that an economy is like a closed system with known stochastic probability distributions for all different events. In reality it is straining one’s beliefs to try to represent economies as outcomes of stochastic processes. An existing economy is a single realization tout court, and hardly conceivable as one realization out of an ensemble of economy-worlds, since an economy can hardly be conceived as being completely replicated over time. It is — to say the least — very difficult to see any similarity between these modelling assumptions and the expectations of real persons.

In the world of the rational expectations hypothesis we are never disappointed in any other way than when we lose at the roulette wheel. But real life is not an urn or a roulette wheel. And that’s also the reason why allowing for cases where agents make ‘predictable errors’ in DSGE models doesn’t take us any closer to a relevant and realist depiction of actual economic decisions and behaviours. If we really want to have anything of interest to say about real economies, financial crises, and the decisions and choices real people make, we have to replace the rational expectations hypothesis with more relevant and realistic assumptions concerning economic agents and their expectations than childish roulette and urn analogies.

‘Rigorous’ and ‘precise’ DSGE models cannot be considered anything other than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence of any kind has been presented.

No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, such models do not push economic science forward one single millimetre if they do not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not say anything about real-world economies.

Proving things ‘rigorously’ in DSGE models is at most a starting-point for doing an interesting and relevant economic analysis. Forgetting to supply export warrants to the real world makes the analysis an empty exercise in formalism without real scientific value.

Mainstream economists think there is a gain from the DSGE style of modeling in its capacity to offer some kind of structure around which to organise discussions. To me that sounds more like a religious theoretical-methodological dogma, where one paradigm rules in divine hegemony. That’s not progress. That’s the death of economics as a science.

Stjärnorna kvittar det lika (personal)

16 June, 2017 at 09:45 | Posted in Varia

 

 

In loving memory of my mother. Nils Ferlin was her favourite poet.

What kind of realist am I?

14 June, 2017 at 17:51 | Posted in Theory of Science & Methodology

Some commentators on this blog seem to be of the opinion that since yours truly is critical of mainstream economics and asks for more relevance and realism, I’m bound to be a “naive” realist or empiricist.

Nothing could be further from the truth!

In a time when scientific relativism is expanding, it is important to keep up the claim that science cannot be reduced to a purely discursive level. We have to maintain the Enlightenment tradition of thinking of reality as principally independent of our views of it, and of the main task of science as studying the structure of this reality. Perhaps the most important contribution a researcher can make is to reveal what this reality that is the object of science actually looks like.

Science is made possible by the fact that there are structures that are durable and are independent of our knowledge or beliefs about them. There exists a reality beyond our theories and concepts of it. It is this independent reality that our theories in some way deal with. Contrary to positivism, I cannot see that the main task of science is to detect event-regularities between observed facts. Rather, the task must be conceived as identifying the underlying structure and forces that produce the observed events.

The problem with positivist social science is not that it gives the wrong answers, but rather that in a strict sense it does not give answers at all. Its explanatory models presuppose that the social reality is ‘closed,’ and since social reality is fundamentally ‘open,’ models of that kind cannot explain anything of what happens in such a universe. Positivist social science has to postulate closed conditions to make its models operational and then – totally unrealistically – impute these closed conditions to society’s real structure.

In the face of the kind of methodological individualism and rational choice theory that dominate positivist social science, we have to admit that even if knowledge of the aspirations and intentions of individuals is a necessary prerequisite for explaining social events, it is far from sufficient. Even the most elementary ‘rational’ actions in society presuppose the existence of social forms that cannot be reduced to the intentions of individuals.

The overarching flaw with methodological individualism and rational choice theory is basically that they reduce social explanations to purportedly individual characteristics. But many of the characteristics and actions of the individual originate in and are made possible only through society and its relations. Society is not reducible to individuals, since the social characteristics, forces, and actions of the individual are determined by pre-existing social structures and positions. Even though society is not a volitional individual, and the individual is not an entity given outside of society, the individual (actor) and the society (structure) have to be kept analytically distinct. They are tied together through the individual’s reproduction and transformation of already given social structures.

What makes knowledge in social sciences possible is the fact that society consists of social structures and positions that influence the individuals of society, partly through their being the necessary prerequisite for the actions of individuals but also because they dispose individuals to act (within a given structure) in a certain way. These structures constitute the ‘deep structure’ of society.

Our observations and theories are concept-dependent without therefore necessarily being concept-determined. There is a reality existing independently of our knowledge and theories of it. Although we cannot apprehend it without using our concepts and theories, these are not the same as reality itself. Reality and our concepts of it are not identical. Social science is made possible by existing structures and relations in society that are continually reproduced and transformed by different actors.

Explanations and predictions of social phenomena require theory constructions. Just looking for correlations between events is not enough. One has to get under the surface and see the deeper underlying structures and mechanisms that essentially constitute the social system.

The basic question one has to pose when studying social relations and events is: what are the fundamental relations without which they would cease to exist? The answer will point to causal mechanisms and tendencies that act in the concrete contexts we study. Whether these mechanisms are activated, and what effects they will have if they are, is not possible to predict, since that depends on accidental and variable relations. Every social phenomenon is determined by a host of both necessary and contingent relations, and it is impossible in practice to have complete knowledge of these constantly changing relations. That is also why we can never confidently predict them. What we can do, through learning about the mechanisms of the structures of society, is to identify the driving forces behind them, thereby making it possible to indicate the direction in which things tend to develop.

The world itself should never be conflated with the knowledge we have of it. Science can only produce meaningful, relevant and realist knowledge if it acknowledges its dependence on the world out there. Ultimately that also means that the critique yours truly levels against mainstream economics is that it doesn’t take that ontological requirement seriously.

Solow being uncomfortable with ‘modern’ macroeconomics

12 June, 2017 at 18:53 | Posted in Economics

So in what sense is this “dynamic stochastic general equilibrium” model firmly grounded in the principles of economic theory? I do not want to be misunderstood. Friends have reminded me that much of the effort of “modern macro” goes into the incorporation of important deviations from the Panglossian assumptions that underlie the simplistic application of the Ramsey model to positive macroeconomics. Research focuses on the implications of wage and price stickiness, gaps and asymmetries of information, long-term contracts, imperfect competition, search, bargaining and other forms of strategic behavior, and so on. That is indeed so, and it is how progress is made.

But this diversity only intensifies my uncomfortable feeling that something is being put over on us, by ourselves. Why do so many of those research papers begin with a bow to the Ramsey model and cling to the basic outline? Every one of the deviations that I just mentioned was being studied by macroeconomists before the “modern” approach took over. That research was dismissed as “lacking microfoundations.” My point is precisely that attaching a realistic or behavioral deviation to the Ramsey model does not confer microfoundational legitimacy on the combination. Quite the contrary: a story loses legitimacy and credibility when it is spliced to a simple, extreme, and on the face of it, irrelevant special case. This is the core of my objection: adding some realistic frictions does not make it any more plausible that an observed economy is acting out the desires of a single, consistent, forward-looking intelligence …

For completeness, I suppose it could also be true that the bow to the Ramsey model is like wearing the school colors or singing the Notre Dame fight song: a harmless way of providing some apparent intellectual unity, and maybe even a minimal commonality of approach. That seems hardly worthy of grown-ups, especially because there is always a danger that some of the in-group come to believe the slogans, and it distorts their work …

There has always been a purist streak in economics that wants everything to follow neatly from greed, rationality, and equilibrium, with no ifs, ands, or buts. Most of us have felt that tug. Here is a theory that gives you just that, and this time “everything” means everything: macro, not micro. The theory is neat, learnable, not terribly difficult, but just technical enough to feel like “science.”

Robert Solow

Yes, indeed, there certainly is a “purist streak in economics that wants everything to follow neatly from greed, rationality, and equilibrium, with no ifs, ands, or buts.” That purist streak has given birth to a kind of ‘deductivist blindness’ in mainstream economics, something that to a large extent also explains why it contributes to causing economic crises rather than to solving them. But where does this ‘deductivist blindness’ of mainstream economics come from? To answer that question we have to examine the methodology of mainstream economics.

The insistence on constructing models showing the certainty of logical entailment has been central in the development of mainstream economics. Insisting on formalistic (mathematical) modeling has more or less forced the economist to give up on realism and substitute axiomatics for real-world relevance. The price paid for the illusory rigour and precision has been monumentally high.

This deductivist orientation is the main reason behind the difficulty that mainstream economics has in understanding, explaining and predicting what takes place in our societies. But it has also given mainstream economics much of its discursive power – at least as long as no one starts asking tough questions about the veracity of – and justification for – the assumptions on which the deductivist foundation is erected. Asking these questions is an important ingredient in a sustained critical effort to show how nonsensical the embellishing of a smorgasbord of models founded on wanting (often hidden) methodological foundations really is.

The mathematical-deductivist straitjacket used in mainstream economics presupposes atomistic closed systems — i.e., something that we find very little of in the real world, a world significantly at odds with an (implicitly) assumed logic world where deductive entailment rules the roost. Ultimately, then, the failings of modern mainstream economics have their root in a deficient ontology. The kind of formal-analytical and axiomatic-deductive mathematical modeling that makes up the core of mainstream economics is hard to make compatible with a real-world ontology. It is also the reason why so many critics find mainstream economic analysis patently and utterly unrealistic and irrelevant. The empty formalism that Solow points at in his critique of ‘modern’ macroeconomics is still one of the main reasons behind the monumental failure of ‘modern’ macroeconomics.
