Greetings from Hamburg (personal)

26 July, 2016 at 19:50 | Posted in Varia | Leave a comment


On our way down to Heidelberg we spent a couple of days in Hamburg, visiting Speicherstadt, the largest warehouse district in the world built on timber-pile foundations, and since 2015 a UNESCO World Heritage Site.

Awesome.

Econometric forecasting — an assessment

26 July, 2016 at 16:44 | Posted in Statistics & Econometrics | Leave a comment

There have been over four decades of econometric research on business cycles … The formalization has undeniably improved the scientific strength of business cycle measures …

But the significance of the formalization becomes more difficult to identify when it is assessed from the applied perspective, especially when the success rate in ex-ante forecasts of recessions is used as a key criterion. The fact that the onset of the 2008 financial-crisis-triggered recession was predicted by only a few ‘Wise Owls’ … while missed by regular forecasters armed with various models serves us as the latest warning that the efficiency of the formalization might be far from optimal. Remarkably, not only has the performance of time-series data-driven econometric models been off the track this time, so has that of the whole bunch of theory-rich macro dynamic models developed in the wake of the rational expectations movement, which derived its fame mainly from exploiting the forecast failures of the macro-econometric models of the mid-1970s recession.

The limits of econometric forecasting have, as noted by Qin, been critically pointed out many times before.

Trygve Haavelmo — with the completion (in 1958) of the twenty-fifth volume of Econometrica — assessed the role of econometrics in the advancement of economics, and although mainly positive about the “repair work” and “clearing-up work” done, Haavelmo also found some grounds for despair:

We have found certain general principles which would seem to make good sense. Essentially, these principles are based on the reasonable idea that, if an economic model is in fact “correct” or “true,” we can say something a priori about the way in which the data emerging from it must behave. We can say something, a priori, about whether it is theoretically possible to estimate the parameters involved. And we can decide, a priori, what the proper estimation procedure should be … But the concrete results of these efforts have often been a seemingly lower degree of accuracy of the would-be economic laws (i.e., larger residuals), or coefficients that seem a priori less reasonable than those obtained by using cruder or clearly inconsistent methods.

There is the possibility that the more stringent methods we have been striving to develop have actually opened our eyes to recognize a plain fact: viz., that the “laws” of economics are not very accurate in the sense of a close fit, and that we have been living in a dream-world of large but somewhat superficial or spurious correlations.

And as the quote below shows, even Ragnar Frisch shared some of Haavelmo’s — and Keynes’s — doubts on the applicability of econometrics:

I have personally always been skeptical of the possibility of making macroeconomic predictions about the development that will follow on the basis of given initial conditions … I have believed that the analytical work will give higher yields – now and in the near future – if they become applied in macroeconomic decision models where the line of thought is the following: “If this or that policy is made, and these conditions are met in the period under consideration, probably a tendency to go in this or that direction is created”.

Ragnar Frisch

Econometrics may be an informative tool for research. But if its practitioners do not investigate and make an effort to provide a justification for the credibility of the assumptions on which they erect their building, it will not fulfill its tasks. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics will continue to consider its ultimate argument a mixture of rather unhelpful metaphors and metaphysics. Maintaining that economics is a science in the “true knowledge” business, I remain a skeptic of the pretences and aspirations of econometrics. So far, I cannot really see that it has yielded very much in terms of relevant, interesting economic knowledge. And, more specifically, when it comes to forecasting activities, the results have been bleak indeed.

The marginal return on its ever higher technical sophistication in no way makes up for the lack of serious under-labouring of its deeper philosophical and methodological foundations that Keynes already complained about. The rather one-sided emphasis on usefulness and its concomitant instrumentalist justification cannot hide the fact that the legions of probabilistic econometricians who consider it “fruitful to believe” in the possibility of treating unique economic data as the observable results of random drawings from an imaginary sampling of an imaginary population are skating on thin ice. After having analyzed some of its ontological and epistemological foundations, I cannot but conclude that econometrics on the whole has not delivered “truth,” nor robust forecasts. And I doubt that this has ever been the intention of its main protagonists.

Our admiration for technical virtuosity should not blind us to the fact that we have to have a more cautious attitude towards probabilistic inference of causality in economic contexts. Science should help us penetrate to — as Keynes put it — “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts.” We should look out for causal relations, but econometrics can never be more than a starting point in that endeavour, since econometric (statistical) explanations are not explanations in terms of mechanisms, powers, capacities or causes. Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that there are other variables – of vital importance, and although perhaps unobservable and non-additive not necessarily epistemologically inaccessible – that were not considered for the model. The variables that were included can hence never be guaranteed to be more than potential causes, not real causes.

A rigorous application of econometric methods in economics really presupposes that the phenomena of our real-world economies are ruled by stable causal relations between variables. A perusal of the leading econom(etr)ic journals shows that most econometricians still concentrate on fixed-parameter models and that parameter values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption, however, one has to convincingly establish that the targeted acting causes are stable and invariant, so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.
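
The point about exporting parameters can be made with a minimal simulation sketch (plain NumPy, made-up numbers, not any published model): an OLS coefficient estimated under one regime forecasts well in-sample and falls apart once the underlying parameter has drifted.

```python
import numpy as np

rng = np.random.default_rng(42)

# 'Regime 1': data generated with a stable causal parameter beta = 1.5
x1 = rng.normal(size=200)
y1 = 1.5 * x1 + rng.normal(scale=0.5, size=200)

# OLS estimate of beta on regime-1 data
beta_hat = (x1 @ y1) / (x1 @ x1)

# 'Regime 2': the structure has shifted and beta is now 0.3
x2 = rng.normal(size=200)
y2 = 0.3 * x2 + rng.normal(scale=0.5, size=200)

# Export the regime-1 parameter to the new context and forecast
rmse_in  = np.sqrt(np.mean((y1 - beta_hat * x1) ** 2))
rmse_out = np.sqrt(np.mean((y2 - beta_hat * x2) ** 2))

print(f"estimated beta (regime 1): {beta_hat:.2f}")
print(f"in-sample RMSE:     {rmse_in:.2f}")   # near the noise level (0.5)
print(f"out-of-sample RMSE: {rmse_out:.2f}")  # much larger: the parameter did not travel
```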

This is a more fundamental and radical problem than the celebrated “Lucas critique” has suggested. The question is not whether deep parameters, absent on the macro-level, exist in “tastes” and “technology” on the micro-level. It goes deeper. Real-world social systems are not governed by stable causal mechanisms or capacities. This is the criticism that Keynes — in Essays in Biography — first launched against econometrics and inferential statistics as early as the 1920s:

The atomic hypothesis which has worked so splendidly in Physics breaks down in Psychics. We are faced at every turn with the problems of Organic Unity, of Discreteness, of Discontinuity – the whole is not equal to the sum of the parts, comparisons of quantity fail us, small changes produce large effects, the assumptions of a uniform and homogeneous continuum are not satisfied. Thus the results of Mathematical Psychics turn out to be derivative, not fundamental, indexes, not measurements, first approximations at the best; and fallible indexes, dubious approximations at that, with much doubt added as to what, if anything, they are indexes or approximations of.

The kinds of laws and relations that econom(etr)ics has established are laws and relations about entities in models that presuppose causal mechanisms being atomistic and additive. When causal mechanisms operate in real-world social target systems, they only do so in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so (as a rule) only because we engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately, that also makes most of the achievements of econometrics – as most contemporary endeavours of economic theoretical modeling – rather useless.

Austerity policies — nothing but kindergarten economics

25 July, 2016 at 18:51 | Posted in Economics | 2 Comments


I definitely recommend that everyone watch this well-argued interview with Steve Keen.

To many conservative and neoliberal politicians and economists there seems to be a spectre haunting the United States and Europe today — Keynesian ideas on governments pursuing policies raising effective demand and supporting employment. And some of the favourite arguments used among these Keynesophobics to fight it are the confidence argument and the doctrine of ‘sound finance.’

Is this witless crusade against economic reason new? Not at all!

It should be first stated that, although most economists are now agreed that full employment may be achieved by government spending, this was by no means the case even in the recent past. Among the opposers of this doctrine there were (and still are) prominent so-called ‘economic experts’ closely connected with banking and industry. This suggests that there is a political background in the opposition to the full employment doctrine, even though the arguments advanced are economic. That is not to say that people who advance them do not believe in their economics, poor though this is. But obstinate ignorance is usually a manifestation of underlying political motives …

Clearly, higher output and employment benefit not only workers but entrepreneurs as well, because the latter’s profits rise. And the policy of full employment outlined above does not encroach upon profits because it does not involve any additional taxation. The entrepreneurs in the slump are longing for a boom; why do they not gladly accept the synthetic boom which the government is able to offer them? It is this difficult and fascinating question with which we intend to deal in this article …

We shall deal first with the reluctance of the ‘captains of industry’ to accept government intervention in the matter of employment. Every widening of state activity is looked upon by business with suspicion, but the creation of employment by government spending has a special aspect which makes the opposition particularly intense. Under a laissez-faire system the level of employment depends to a great extent on the so-called state of confidence. If this deteriorates, private investment declines, which results in a fall of output and employment (both directly and through the secondary effect of the fall in incomes upon consumption and investment). This gives the capitalists a powerful indirect control over government policy: everything which may shake the state of confidence must be carefully avoided because it would cause an economic crisis. But once the government learns the trick of increasing employment by its own purchases, this powerful controlling device loses its effectiveness. Hence budget deficits necessary to carry out government intervention must be regarded as perilous. The social function of the doctrine of ‘sound finance’ is to make the level of employment dependent on the state of confidence.

Michal Kalecki, ‘Political Aspects of Full Employment’ (1943)

Good reasons to worry about inequalities

25 July, 2016 at 13:22 | Posted in Economics, Politics & Society | 1 Comment

Focussing upon inequality statistics … misses an important point. What matters is not just the level of income inequality, but how that inequality arose. A free market society in which high incomes arise from the free choices of consenting adults – as in Robert Nozick’s Wilt Chamberlain parable – might have the same Gini coefficient as a crony capitalist society. But they are two different things. A good reason to be worried about current inequality – even if it hasn’t changed – is that it is a symptom of market failures such as corporate welfare, regulatory capture or the implicit subsidy to banks.

In this context, what matters is not just inequalities of income but inequalities of power. Top footballers and top bankers might be earning similar sums, but one’s salary is the product of market forces and the other of a tax-payer subsidy. The freelancer on £30,000 who’s worrying where his next contract is coming from has similar income to the bullying middle managers who created intolerable working conditions at (for example) Sports Direct. But they have very different degrees of economic power. And the low income that results from having to take a lousy job where your wages are topped up by tax credits gives you much less power than the same income that would come from a basic income and the freer choice to take or leave a low wage job.

My point here is a simple one. There are very good reasons why we should worry about inequality – not just leftists but also rightists who want freer markets and “bourgeois” virtues. Focusing only upon the stability of the Gini coefficient is a form of statistical fetishism which overlooks important questions.

Chris Dillow
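
Dillow's opening point, that two very differently constituted societies can share a Gini coefficient, is easy to verify numerically. A toy sketch with made-up incomes (the gini function below is the standard rank-based formula):

```python
import numpy as np

def gini(incomes):
    """Gini coefficient via the standard rank-based formula."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = len(x)
    ranks = np.arange(1, n + 1)
    return 2 * np.sum(ranks * x) / (n * np.sum(x)) - (n + 1) / n

# Society A: incomes spread evenly up the distribution
a = [10, 20, 30, 40, 50]
# Society B: most people on equal incomes, one outlier capturing a large share
b = [30, 30, 30, 50, 100]

print(f"Gini A: {gini(a):.4f}")  # 0.2667
print(f"Gini B: {gini(b):.4f}")  # 0.2667 -- identical statistic, very different society
```

The statistic is blind to how the incomes arose, which is exactly why fixating on its stability misses the question Dillow is asking.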

Lucas-Rapping and ‘New Keynesian’ models of unemployment

25 July, 2016 at 12:06 | Posted in Economics | 2 Comments

Lucas and Rapping (1969) claim that cyclical increases in unemployment occur when workers quit their jobs because wages or salaries fall below expectations …

According to this explanation, when wages are unusually low, people become unemployed in order to enjoy free time, substituting leisure for income at a time when they lose the least income …

According to the theory, quits into unemployment increase during recessions, whereas historically quits decrease sharply and roughly half of unemployed workers become jobless because they are laid off … During the recession I studied, people were even afraid to change jobs because new ones might prove unstable and lead to unemployment …

If wages and salaries hardly ever fall, the intertemporal substitution theory is widely applicable only if the unemployed prefer jobless leisure to continued employment at their old pay. However, the attitude and circumstances of the unemployed are not consistent with their having made this choice …

In real business cycle theory, unemployment is interpreted as leisure optimally selected by workers, as in the Lucas-Rapping model. It has proved difficult to construct business cycle models consistent with this assumption and with real wage fluctuations as small as they are in reality, relative to fluctuations in employment.

Truman F. Bewley

This is, of course, only what you would expect of New Classical Chicago economists.

But sadly enough this extraterrestrial view of unemployment is actually shared by so-called New Keynesians, whose microfounded dynamic stochastic general equilibrium models cannot even incorporate such a basic fact of reality as involuntary unemployment!

Of course, working with microfounded representative-agent models, this should come as no surprise. If one representative agent is employed, all representative agents are. The kind of unemployment that occurs is voluntary, since it is only adjustments of the hours of work that these optimizing agents make to maximize their utility. In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence, unemployment is totally voluntary. To be unemployed is something one optimally chooses to be.

If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

To Keynes this was self-evident. But obviously not so to New Classical and ‘New Keynesian’ economists.

Thorbjörn Fälldin — a giant has passed away

25 July, 2016 at 10:48 | Posted in Politics & Society | Leave a comment


Thorbjörn Fälldin, Sweden's former prime minister and leader of the Centre Party, died in his home in Ramvik on Saturday evening, 90 years old.

In the days of Thorbjörn Fälldin and his predecessor Gunnar Hedlund, the Centre Party still represented a genuine political force in our society.

After Fälldin, the Centre Party turned the page, and in due course we got Maud Olofsson.

With Olofsson, the party's soul was sold out and its decline into neoliberalism began.

Then came Annie Lööf. And a once respected party was run completely into the ground.

That a neoliberal, self-opinionated, platitudinous career politician who worships Ayn Rand and Margaret Thatcher can today sit and rule over a party once led by giants like Hedlund and Fälldin is utterly monstrous. Incomprehensible. And sad.

New Keynesian unemployment — a paid vacation essentially!

24 July, 2016 at 10:45 | Posted in Economics | Leave a comment

Franco Modigliani famously quipped that he did not think that unemployment during the Great Depression should be described, in an economic model, as a “sudden bout of contagious laziness”. Quite. For the past thirty years we have been debating whether to use classical real business cycle models (RBC), or their close cousins, modern New Keynesian (NK) models, to describe recessions. In both of these models, the social cost of persistent unemployment is less than a half a percentage point of steady state consumption.

What does that mean? Median US consumption is roughly $30,000 a year. One half of one percent of this is roughly 50 cents a day. A person inhabiting one of our artificial RBC or NK model worlds would not be willing to pay more than 50 cents a day to avoid another Great Depression. That is true of real business cycle models. It is also true of New Keynesian models …

That’s why I eschew NK and RBC models. They are both wrong. The high unemployment that follows a financial crisis is not the socially efficient response to technology shocks. And the slow recovery from a financial melt-down has nothing to do with the costs of reprinting menus that underpins the models of NK economists. It is a potentially permanent failure of private agents to coordinate on an outcome that is socially desirable.

Roger Farmer
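
Farmer's back-of-the-envelope figure is easy to check using his own numbers:

```python
median_consumption = 30_000                  # dollars per year, Farmer's figure
welfare_cost = 0.005 * median_consumption    # half a percent of steady-state consumption
print(f"{welfare_cost / 365:.2f} dollars a day")  # ~0.41, i.e. 'roughly 50 cents'
```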


In the basic DSGE models used by both New Classical and ‘New Keynesian’ macroeconomists, the labour market is always cleared – responding to a changing interest rate, expected lifetime incomes, or real wages, the representative agent maximizes the utility function by varying her labour supply, money holding and consumption over time. Most importantly – if the real wage somehow deviates from its ‘equilibrium value,’ the representative agent adjusts her labour supply, so that when the real wage is higher than its ‘equilibrium value,’ labour supply is increased, and when the real wage is below its ‘equilibrium value,’ labour supply is decreased.

In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence, unemployment is totally voluntary. To be unemployed is something one optimally chooses to be — a kind of prolonged vacation.
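
A stripped-down sketch of that mechanism, using a hypothetical one-period quasi-linear utility function rather than any particular published model: the first-order condition ties chosen hours directly to the real wage, so any shortfall in hours is, by construction, chosen leisure.

```python
# Hypothetical specification (chosen for transparency, not taken from any
# particular paper): u(c, h) = c - h**(1 + 1/phi) / (1 + 1/phi), with c = w*h.
# The first-order condition w = h**(1/phi) gives optimal hours h* = w**phi.

phi = 0.5          # assumed Frisch elasticity of labour supply
full_time = 1.0    # normalize hours at the 'equilibrium' real wage w = 1

for w in [1.0, 0.9, 0.8, 0.7]:
    hours = w ** phi
    # The shortfall below full_time is chosen leisure -- what this model
    # world books as purely voluntary 'unemployment'.
    print(f"real wage {w:.1f} -> chosen hours {hours:.3f}, "
          f"'unemployment' {full_time - hours:.3f}")
```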

Although this picture of unemployment as a kind of self-chosen optimality strikes most people as utterly ridiculous, there are also, unfortunately, a lot of neoclassical economists out there who still think that price and wage rigidities are the prime movers behind unemployment. DSGE models basically explain variations in employment (and a fortiori output) by assuming nominal wages to be more flexible than prices – disregarding the lack of empirical evidence for this rather counterintuitive assumption.

Lowering nominal wages would not clear the labour market. Lowering wages – and possibly prices – could, perhaps, lower interest rates and increase investment. It would be much easier to achieve that effect by increasing the money supply. In any case, wage reductions were not seen as a general substitute for an expansionary monetary or fiscal policy. And even if potentially positive impacts of lowering wages exist, there are also more heavily weighing negative impacts – management-union relations deteriorating, expectations of ongoing wage cuts causing delay of investments, debt deflation et cetera.

The classical proposition that lowering wages would lower unemployment and ultimately take economies out of depressions was ill-founded and basically wrong. Flexible wages would probably only make things worse by leading to erratic price fluctuations. The basic explanation for unemployment is insufficient aggregate demand, and that is mostly determined outside the labour market.

Obviously it’s rather embarrassing that the kind of DSGE models ‘modern’ macroeconomists use cannot incorporate such a basic fact of reality as involuntary unemployment. Of course, working with representative agent models, this should come as no surprise. The kind of unemployment that occurs is voluntary, since it is only adjustments of the hours of work that these optimizing agents make to maximize their utility.

And as if this nonsense economics were not enough: in New Classical and ‘New Keynesian’ DSGE models, increases in government spending lead to a drop in private consumption!

How on earth does one arrive at such a bizarre view?

In the most basic mainstream proto-DSGE models, one often assumes that governments finance current expenditures with current tax revenues. This has a negative income effect on households, leading — rather counterintuitively — to a drop in private consumption even though both employment and production expand. This mechanism also holds when the (in)famous Ricardian equivalence is added to the models.

Ricardian equivalence basically means that financing government expenditures through taxes or debts is equivalent, since debt financing must be repaid with interest, and agents — equipped with rational expectations — would only increase savings in order to be able to pay the higher taxes in the future, thus leaving total expenditures unchanged.

Why?

In the standard neoclassical consumption model — used in DSGE macroeconomic modeling — people are basically portrayed as treating time as a dichotomous phenomenon — today and the future — when contemplating making decisions and acting. How much should one consume today and how much in the future? Facing an intertemporal budget constraint of the form

$$c_t + \frac{c_f}{1+r} = f_t + y_t + \frac{y_f}{1+r},$$

where $c_t$ is consumption today, $c_f$ is consumption in the future, $f_t$ is holdings of financial assets today, $y_t$ is labour income today, $y_f$ is labour income in the future, and $r$ is the real interest rate, and having a lifetime utility function of the form

$$U = u(c_t) + a\,u(c_f),$$

where $a$ is the time-discounting parameter, the representative agent (consumer) maximizes his utility when

$$u'(c_t) = a(1+r)\,u'(c_f).$$

This expression – the Euler equation – says that, at the optimum, the representative agent (consumer) is indifferent between consuming one more unit today and saving it to consume tomorrow. Typically using a logarithmic functional form – $u(c) = \log c$, which gives $u'(c) = 1/c$ – the Euler equation can be rewritten as

$$\frac{1}{c_t} = a(1+r)\frac{1}{c_f},$$

or

$$\frac{c_f}{c_t} = a(1+r).$$

Importantly, this implies that according to the neoclassical consumption model, changes in the (real) interest rate and consumption growth move in the same direction. It also follows that consumption is invariant to the timing of taxes, since wealth — $f_t + y_t + y_f/(1+r)$ — has to be interpreted as present discounted value net of taxes. And so, according to the assumption of Ricardian equivalence, the timing of taxes does not affect consumption, simply because the maximization problem as specified in the model is unchanged. As a result, households cut down on their consumption when governments increase their spending. Mirabile dictu!
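
A numerical sketch of the model just derived (illustrative parameter values, nothing estimated): solving the log-utility problem in closed form and shifting the tax burden between the two periods leaves the consumption plan untouched.

```python
# Two-period consumption choice with log utility:
#   max  log(c_t) + a*log(c_f)
#   s.t. c_t + c_f/(1+r) = f_t + (y_t - tax_t) + (y_f - tax_f)/(1+r)
# Closed form: c_t = W/(1+a) and c_f = a*(1+r)*c_t, where W is net wealth.
# Parameter values below are purely illustrative.

a, r = 0.95, 0.05
f_t, y_t, y_f = 10.0, 100.0, 100.0
g = 20.0  # present value of the revenue the government must raise

def consumption(tax_t):
    """Optimal (c_t, c_f) given how much of the burden is taxed today."""
    tax_f = (g - tax_t) * (1 + r)   # remainder taxed, with interest, tomorrow
    W = f_t + (y_t - tax_t) + (y_f - tax_f) / (1 + r)
    c_t = W / (1 + a)
    return c_t, a * (1 + r) * c_t

for tax_t in [0.0, 10.0, 20.0]:       # tax everything later ... everything now
    print(tax_t, consumption(tax_t))  # identical consumption pair every time
```

Note that the invariance hangs entirely on the single lifetime budget constraint: introduce liquidity constraints or finite horizons and the timing of taxes matters again.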

Macroeconomic models have to abandon the Ricardian equivalence nonsense. But replacing it with “overlapping generations” and “infinite-horizon” models is — in terms of realism and relevance — just getting out of the frying pan into the fire. All unemployment is still voluntary. Intertemporal substitution between labour and leisure is still ubiquitous. And the specification of the utility function is still hopelessly off the mark from an empirical point of view.

As one Nobel laureate had it:

Ricardian equivalence is taught in every graduate school in the country. It is also sheer nonsense.

Joseph E. Stiglitz, Twitter

And as one economics blogger has it:

New Classical and ‘New Keynesian’ DSGE modeling is taught in every graduate school in the country. It is also sheer nonsense.

Lars P. Syll, Twitter

Economics vs. reality

24 July, 2016 at 09:41 | Posted in Economics | 1 Comment

Denicolò and Zanchettin, in an article published by the prestigious Economic Journal, claim to have shown among other things that “stronger patent protection may reduce innovation and growth.” As a prelude to forty pages of mathematics, they state of their model, “The economy is populated by L identical, infinitely lived, individuals … There is a unique final good in the economy that can be consumed, used to produce intermediate goods, or used in research …” Not only are all the people in this model world identical and immortal, they only produce a single product. The product has properties that are entirely unreal—not so much science fiction as pure magic. The conclusion may be justified, or not; but the idea that a model so remote from reality can be used to make public policy recommendations is, to anyone but a fully certified neoclassical economist, staggering.

The mathematization of economics since WW II has made mainstream — neoclassical — economists more or less obsessed with formal, deductive-axiomatic models. Confronted with the critique that they do not solve real problems, they often react as Saint-Exupéry’s Great Geographer, who, in response to the questions posed by The Little Prince, says that he is too occupied with his scientific work to be able to say anything about reality. Confronted with economic theory’s lack of relevance and inability to tackle real problems, one retreats into the wonderful world of economic models. One goes into the “shack of tools” — as my old mentor Erik Dahmén used to say — and stays there. While the economic problems in the world around us steadily increase, one is rather happily playing along with the latest toys in the mathematical toolbox.

Modern mainstream economics is no doubt very rigorous — but if it is rigorously wrong, who cares?

Instead of making formal logical argumentation based on deductive-axiomatic models the message, I think we are better served by economists who more than anything else try to contribute to solving real problems. And then the motto of John Maynard Keynes is more valid than ever:

It is better to be vaguely right than precisely wrong.

Racial bias in police shootings

23 July, 2016 at 18:23 | Posted in Politics & Society, Statistics & Econometrics | 3 Comments

Roland Fryer, an economics professor at Harvard University, recently published a working paper at NBER on the topic of racial bias in police use of force and police shootings. The paper gained substantial media attention – a write-up of it became the top viewed article on the New York Times website. The most notable part of the study was its finding that there was no evidence of racial bias in police shootings, which Fryer called “the most surprising result of [his] career”. In his analysis of shootings in Houston, Texas, black and Hispanic people were no more likely (and perhaps even less likely) to be shot relative to whites.

Fryer’s analysis is highly flawed, however … Fryer was not comparing rates of police shootings by race. Instead, his research asked whether these racial differences were the result of “racial bias” rather than merely “statistical discrimination”. Both terms have specific meanings in economics. Statistical discrimination occurs when an individual or institution treats people differently based on racial stereotypes that ‘truly’ reflect the average behavior of a racial group. For instance, if a city’s black drivers are 50% more likely to possess drugs than white drivers, and police officers are 50% more likely to pull over black drivers, economic theory would hold that this discriminatory policing is rational …

Once explained, it is possible to find the idea of “statistical discrimination” just as abhorrent as “racial bias”. One could point out that the drug laws police enforce were passed with racially discriminatory intent, that collectively punishing black people based on “average behavior” is wrong, or that – as a self-fulfilling prophecy – bias can turn into statistical discrimination (if black people’s cars are searched more thoroughly, for instance, it will appear that their rates of drug possession are higher) …

Even if one accepts the logic of statistical discrimination versus racial bias, it is an inappropriate choice for a study of police shootings. The method that Fryer employs has, for the most part, been used to study traffic stops and stop-and-frisk practices. In those cases, economic theory holds that police want to maximize the number of arrests for the possession of contraband (such as drugs or weapons) while expending the fewest resources. If they are acting in the most cost-efficient, rational manner, the officers may use racial stereotypes to increase the arrest rate per stop. This theory completely falls apart for police shootings, however, because officers are not trying to rationally maximize the number of shootings …

Economic theory aside, there is an even more fundamental problem with the Houston police shooting analysis. In a typical study, a researcher will start with a previously defined population where each individual is at risk of a particular outcome. For instance, a population of drivers stopped by police can have one of two outcomes: they can be arrested, or they can be sent on their way. Instead of following this standard approach, Fryer constructs a fictitious population of people who are shot by police and people who are arrested. The problem here is that these two groups (those shot and those arrested) are, in all likelihood, systematically different from one another in ways that cannot be controlled for statistically … Properly interpreted, the actual result from Fryer’s analysis is that the racial disparity in arrest rates is larger than the racial disparity in police shootings. This is an unsurprising finding, and proves neither a lack of bias nor a lack of systematic discrimination.

Justin Feldman
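
Feldman's final point can be illustrated with purely hypothetical numbers: when the arrest disparity (say 3 to 1) exceeds the shooting disparity (say 2 to 1), benchmarking shootings against arrests mechanically produces an apparent anti-bias result.

```python
# Purely hypothetical rates per 100,000 population in each group --
# not Fryer's or Feldman's data, just an illustration of the mechanism.
arrests   = {"white": 1_000, "black": 3_000}  # 3x disparity in arrests
shootings = {"white": 10,    "black": 20}     # 2x disparity in shootings

for group in ("white", "black"):
    rate = shootings[group] / arrests[group]
    print(f"{group}: P(shot | arrested) = {rate:.3%}")

# white: 1.000%, black: 0.667%. Conditional on arrest, black civilians look
# *less* likely to be shot, even though their population-level shooting rate
# is twice as high: benchmarking on arrests builds the larger arrest
# disparity into the denominator.
```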

What makes most econometric models invalid

23 July, 2016 at 10:42 | Posted in Statistics & Econometrics | Leave a comment

The assumption of additivity and linearity means that the outcome variable is, in reality, linearly related to any predictors … and that if you have several predictors then their combined effect is best described by adding their effects together …

This assumption is the most important because if it is not true then even if all other assumptions are met, your model is invalid because you have described it incorrectly. It’s a bit like calling your pet cat a dog: you can try to get it to go in a kennel, or to fetch sticks, or to sit when you tell it to, but don’t be surprised when its behaviour isn’t what you expect because even though you’ve called it a dog, it is in fact a cat. Similarly, if you have described your statistical model inaccurately it won’t behave itself and there’s no point in interpreting its parameter estimates or worrying about significance tests or confidence intervals: the model is wrong.

Andy Field
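
Field's point in code (simulated data and a deliberately misspecified model): fit a straight line to a relationship that is actually quadratic and the estimated slope tells you nothing about the real dependence.

```python
import numpy as np

rng = np.random.default_rng(0)

x = np.linspace(-3, 3, 200)
y = x**2 + rng.normal(scale=0.5, size=x.size)  # the true relation is quadratic

# Misspecified model: y = b0 + b1*x (additive and linear)
X = np.column_stack([np.ones_like(x), x])
b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]

print(f"estimated slope: {b1:.3f}")  # close to zero
# The near-zero slope does not mean x is unrelated to y; the model is simply
# the wrong description of the data, so interpreting its parameters or their
# significance is pointless -- the cat has been called a dog.
```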

