The Lucas Critique is but a shallow version of the Keynes Critique (wonkish)

23 Jul, 2012 at 12:23 | Posted in Economics, Theory of Science & Methodology | 6 Comments

If we are not able to show that the mechanisms or causes that we isolate and handle in our models are stable – in the sense that they do not change when we “export” them to our “target systems” – they only hold under ceteris paribus conditions and are a fortiori of limited value to our understanding, explanations or predictions of real economic systems. Or as the always eminently quotable Keynes wrote in A Treatise on Probability (1921):

The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be [that] the system of the material universe must consist of bodies … such that each of them exercises its own separate, independent, and invariable effect, a change of the total state being compounded of a number of separate changes each of which is solely due to a separate portion of the preceding state … Yet there might well be quite different laws for wholes of different degrees of complexity, and laws of connection between complexes which could not be stated in terms of laws connecting individual parts … If different wholes were subject to different laws qua wholes and not simply on account of and in proportion to the differences of their parts, knowledge of a part could not lead, it would seem, even to presumptive or probable knowledge as to its association with other parts … These considerations do not show us a way by which we can justify induction … /427 No one supposes that a good induction can be arrived at merely by counting cases. The business of strengthening the argument chiefly consists in determining whether the alleged association is stable, when accompanying conditions are varied … /468 In my judgment, the practical usefulness of those modes of inference … on which the boasted knowledge of modern science depends, can only exist … if the universe of phenomena does in fact present those peculiar characteristics of atomism and limited variety which appears more and more clearly as the ultimate result to which material science is tending.

Econometrics may be an informative tool for research. But if its practitioners do not investigate and make an effort to provide a justification for the credibility of the assumptions on which they erect their building, it will not fulfill its tasks. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics will continue to consider its ultimate argument a mixture of rather unhelpful metaphors and metaphysics. Maintaining that economics is a science in the “true knowledge” business, I remain a skeptic of the pretences and aspirations of econometrics. So far, I cannot really see that it has yielded very much in terms of relevant, interesting economic knowledge.

The marginal return on its ever higher technical sophistication in no way makes up for the lack of serious under-labouring of its deeper philosophical and methodological foundations that Keynes already complained about. The rather one-sided emphasis on usefulness and its concomitant instrumentalist justification cannot hide that neither Haavelmo, nor the legions of probabilistic econometricians following in his footsteps, give supportive evidence for considering it “fruitful to believe” in the possibility of treating unique economic data as the observable results of random drawings from an imaginary sampling of an imaginary population. After having analyzed some of its ontological and epistemological foundations, I cannot but conclude that econometrics on the whole has not delivered “truth”. And I doubt if it has ever been the intention of its main protagonists.

Our admiration for technical virtuosity should not blind us to the fact that we have to have a cautious attitude towards probabilistic inferences in economic contexts. Science should help us penetrate to “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts” [Keynes 1971-89 vol XVII:427]. We should look out for causal relations, but econometrics can never be more than a starting point in that endeavour, since econometric (statistical) explanations are not explanations in terms of mechanisms, powers, capacities or causes. Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that there are other variables – of vital importance, and although perhaps unobservable and non-additive, not necessarily epistemologically inaccessible – that were not considered for the model. The variables that were included can hence never be guaranteed to be more than potential causes, not real causes. A rigorous application of econometric methods in economics really presupposes that the phenomena of our real world economies are ruled by stable causal relations between variables. A perusal of the leading econom(etr)ic journals shows that most econometricians still concentrate on fixed parameter models and that parameter-values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption one, however, has to convincingly establish that the targeted acting causes are stable and invariant, so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.
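
The exportability problem is easy to illustrate with a minimal simulation sketch (all the numbers below are invented; the point is the logic, not the data): a parameter carefully estimated in one regime may not even get the sign of the relation right once the underlying causal structure has shifted.

# A minimal simulation sketch (invented numbers) of why a parameter estimated
# in one spatio-temporal context need not be exportable to another.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=2 * n)
noise = rng.normal(scale=0.5, size=2 * n)

# Regime 1: y responds to x with slope 0.8. Regime 2: the relation has shifted.
y = np.where(np.arange(2 * n) < n, 0.8 * x + noise, -0.3 * x + noise)

beta_estimated = np.polyfit(x[:n], y[:n], deg=1)[0]   # fitted on regime 1 only
beta_actual    = np.polyfit(x[n:], y[n:], deg=1)[0]   # what holds in regime 2

print(round(beta_estimated, 2), round(beta_actual, 2))
# Roughly 0.8 versus -0.3: the "fixed" parameter exported from the first
# context gets even the sign of the relation wrong in the second.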

This is a more fundamental and radical problem than the celebrated “Lucas critique” has suggested. It is not a question of whether deep parameters, absent on the macro-level, exist in “tastes” and “technology” on the micro-level. It goes deeper. Real world social systems are not governed by stable causal mechanisms or capacities. That is the criticism Keynes first launched against econometrics and inferential statistics as early as the 1920s:

The atomic hypothesis which has worked so splendidly in Physics breaks down in Psychics. We are faced at every turn with the problems of Organic Unity, of Discreteness, of Discontinuity – the whole is not equal to the sum of the parts, comparisons of quantity fails us, small changes produce large effects, the assumptions of a uniform and homogeneous continuum are not satisfied. Thus the results of Mathematical Psychics turn out to be derivative, not fundamental, indexes, not measurements, first approximations at the best; and fallible indexes, dubious approximations at that, with much doubt added as to what, if anything, they are indexes or approximations of.

The kinds of laws and relations that econom(etr)ics has established are laws and relations about entities in models that presuppose causal mechanisms being atomistic and additive. When causal mechanisms operate in real world social target systems they only do it in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain they do it (as a rule) only because we engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately that also makes most of the achievements of econometrics – as with most contemporary endeavours of economic theoretical modeling – rather useless.

Both the “Lucas critique” and the “Keynes critique” of econometrics argued that it was inadmissible to project history onto the future. Consequently an economic policy cannot presuppose that what has worked before will continue to do so in the future. That macroeconomic models could get hold of correlations between different “variables” was not enough. If they could not get at the causal structure that generated the data, they were not really “identified”. Lucas himself drew the conclusion that the solution to the problem of unstable relations was to construct models with clear microfoundations, in which forward-looking optimizing individuals and robust, deep, behavioural parameters are seen to be stable even to changes in economic policies. As yours truly has argued in a couple of posts (e.g. here and here), this, however, is a dead end.

O, horrible! O, horrible! most horrible!

23 Jul, 2012 at 10:03 | Posted in Economics, Politics & Society | Comments Off on O, horrible! O, horrible! most horrible!

George Monbiot has a magnificent article in The Guardian  on the perversion of the concept of freedom that neoliberals and libertarians are trying to bring about. A must read for everyone – but perhaps especially for market fundamentalist sweetwater economists and right wing think tanks with their Panglossian views on “efficient markets”:

Freedom: who could object? Yet this word is now used to justify a thousand forms of exploitation. Throughout the rightwing press and blogosphere, among thinktanks and governments, the word excuses every assault on the lives of the poor, every form of inequality and intrusion to which the 1% subject us. How did libertarianism, once a noble impulse, become synonymous with injustice?

In the name of freedom – freedom from regulation – the banks were permitted to wreck the economy. In the name of freedom, taxes for the super-rich are cut. In the name of freedom, companies lobby to drop the minimum wage and raise working hours. In the same cause, US insurers lobby Congress to thwart effective public healthcare; the government rips up our planning laws; big business trashes the biosphere. This is the freedom of the powerful to exploit the weak, the rich to exploit the poor.

Rightwing libertarianism recognises few legitimate constraints on the power to act, regardless of the impact on the lives of others … [Its] concept of freedom looks to me like nothing but a justification for greed.

So why have we been so slow to challenge this concept of liberty? I believe that one of the reasons is as follows. The great political conflict of our age – between neocons and the millionaires and corporations they support on one side, and social justice campaigners and environmentalists on the other – has been mischaracterised as a clash between negative and positive freedoms. These freedoms were most clearly defined by Isaiah Berlin in his essay of 1958, Two Concepts of Liberty. It is a work of beauty: reading it is like listening to a gloriously crafted piece of music. I will try not to mangle it too badly.

Put briefly and crudely, negative freedom is the freedom to be or to act without interference from other people. Positive freedom is freedom from inhibition: it’s the power gained by transcending social or psychological constraints. Berlin explained how positive freedom had been abused by tyrannies, particularly by the Soviet Union. It portrayed its brutal governance as the empowerment of the people, who could achieve a higher freedom by subordinating themselves to a collective single will.

Rightwing libertarians claim that greens and social justice campaigners are closet communists trying to resurrect Soviet conceptions of positive freedom. In reality, the battle mostly consists of a clash between negative freedoms.

As Berlin noted: “No man’s activity is so completely private as never to obstruct the lives of others in any way. ‘Freedom for the pike is death for the minnows’.” So, he argued, some people’s freedom must sometimes be curtailed “to secure the freedom of others”. In other words, your freedom to swing your fist ends where my nose begins. The negative freedom not to have our noses punched is the freedom that green and social justice campaigns, exemplified by the Occupy movement, exist to defend.

Berlin also shows that freedom can intrude on other values, such as justice, equality or human happiness. “If the liberty of myself or my class or nation depends on the misery of a number of other human beings, the system which promotes this is unjust and immoral.” It follows that the state should impose legal restraints on freedoms that interfere with other people’s freedoms – or on freedoms which conflict with justice and humanity …

But rightwing libertarians do not recognise this conflict. They speak … as if the same freedom affects everybody in the same way. They assert their freedom to pollute, exploit, even – among the gun nuts – to kill, as if these were fundamental human rights. They characterise any attempt to restrain them as tyranny. They refuse to see that there is a clash between the freedom of the pike and the freedom of the minnow …

Modern libertarianism is the disguise adopted by those who wish to exploit without restraint. It pretends that only the state intrudes on our liberties. It ignores the role of banks, corporations and the rich in making us less free. It denies the need for the state to curb them in order to protect the freedoms of weaker people. This bastardised, one-eyed philosophy is a con trick, whose promoters attempt to wrongfoot justice by pitching it against liberty. By this means they have turned “freedom” into an instrument of oppression.

Sticky wages and the transmogrification of truth

22 Jul, 2012 at 22:25 | Posted in Varia | Comments Off on Sticky wages and the transmogrification of truth

In a post on his blog today Paul Krugman writes on a truly important methodological question in economics:

I’ve written quite a lot about sticky wages, aka downward nominal wage rigidity, which is one of those things that we can’t derive from first principles but is a glaringly obvious feature of the real world. But I keep running into comments along the lines of “Well, if you think sticky wages are the problem, why aren’t you calling for wage cuts?”

This is a category error. It confuses the question “What do we need to make sense of what we see?” with the question “What is the problem?”

So right, so right. I only wish this knowledge also found its way into the standard economics textbooks.

Among intermediate neoclassical macroeconomics textbooks, Chad Jones’s Macroeconomics (2nd ed, W W Norton, 2011) stands out as perhaps one of the better alternatives, combining more traditional short-run macroeconomic analysis with an accessible coverage of the Romer model – the foundation of modern growth theory.

Unfortunately it also contains some utter nonsense!

In a chapter on “The Labor Market, Wages, and Unemployment” Jones writes:

The point of this experiment is to show that wage rigidities can lead to large movements in employment. Indeed, they are the reason John Maynard Keynes gave, in The General Theory of Employment, Interest, and Money (1936), for the high unemployment of the Great Depression.

This is of course pure nonsense. For although Keynes in General Theory devoted substantial attention to the subject of wage rigidities, he certainly did not hold the view that wage rigidities were “the reason … for the high unemployment of the Great Depression.”

Since unions/workers, contrary to classical assumptions, make wage-bargains in nominal terms, they will – according to Keynes – accept lower real wages caused by higher prices, but resist lower real wages caused by lower nominal wages. However, Keynes held it incorrect to attribute “cyclical” unemployment to this diversified agent behaviour. During the depression money wages fell significantly and – as Keynes noted – unemployment still grew. Thus, even when nominal wages are lowered, they do not generally lower unemployment.

In any specific labour market, lower wages could, of course, raise the demand for labour. But a general reduction in money wages would leave real wages more or less unchanged. The reasoning of the classical economists was, according to Keynes, a flagrant example of the “fallacy of composition.” Assuming that since unions/workers in a specific labour market could negotiate real wage reductions via lowering nominal wages, unions/workers in general could do the same, the classics confused micro with macro.

Lowering nominal wages could not – according to Keynes – clear the labour market. Lowering wages – and possibly prices – could, perhaps, lower interest rates and increase investment. But to Keynes it would be much easier to achieve that effect by increasing the money supply. In any case, wage reductions were not seen by Keynes as a general substitute for an expansionary monetary or fiscal policy.

Even if potentially positive impacts of lowering wages exist, there are also weightier negative impacts – deteriorating management-union relations, expectations of ongoing wage cuts causing investment to be delayed, debt deflation, et cetera.

So what Keynes actually argued in General Theory was that the classical proposition that lowering wages would lower unemployment and ultimately take economies out of depressions was ill-founded and basically wrong.

To Keynes, flexible wages would only make things worse by leading to erratic price-fluctuations. The basic explanation for unemployment is insufficient aggregate demand, and that is mostly determined outside the labor market.

To mainstream neoclassical theory the kind of unemployment that occurs is voluntary, since it is only adjustments of the hours of work that these optimizing agents make to maximize their utility. Keynes on the other hand writes in General Theory:

The classical school [maintains that] while the demand for labour at the existing money-wage may be satisfied before everyone willing to work at this wage is employed, this situation is due to an open or tacit agreement amongst workers not to work for less, and that if labour as a whole would agree to a reduction of money-wages more employment would be forthcoming. If this is the case, such unemployment, though apparently involuntary, is not strictly so, and ought to be included under the above category of ‘voluntary’ unemployment due to the effects of collective bargaining, etc …

The classical theory … is best regarded as a theory of distribution in conditions of full employment. So long as the classical postulates hold good, unemployment, which is in the above sense involuntary, cannot occur. Apparent unemployment must, therefore, be the result either of temporary loss of work of the ‘between jobs’ type or of intermittent demand for highly specialised resources or of the effect of a trade union ‘closed shop’ on the employment of free labour. Thus writers in the classical tradition, overlooking the special assumption underlying their theory, have been driven inevitably to the conclusion, perfectly logical on their assumption, that apparent unemployment (apart from the admitted exceptions) must be due at bottom to a refusal by the unemployed factors to accept a reward which corresponds to their marginal productivity …

Obviously, however, if the classical theory is only applicable to the case of full employment, it is fallacious to apply it to the problems of involuntary unemployment – if there be such a thing (and who will deny it?). The classical theorists resemble Euclidean geometers in a non-Euclidean world who, discovering that in experience straight lines apparently parallel often meet, rebuke the lines for not keeping straight – as the only remedy for the unfortunate collisions which are occurring. Yet, in truth, there is no remedy except to throw over the axiom of parallels and to work out a non-Euclidean geometry. Something similar is required to-day in economics. We need to throw over the second postulate of the classical doctrine and to work out the behaviour of a system in which involuntary unemployment in the strict sense is possible.

Unfortunately, Jones’s macroeconomics textbook is not the only one containing this kind of utter nonsense on Keynes. Similar distortions of Keynes’s views can be found in, e.g., the economics textbooks of the “New Keynesian” – a grotesque misnomer – Greg Mankiw. How is this possible? Probably because these economists have but a very superficial acquaintance with Keynes’s own works, and rather depend on second-hand sources like Hansen, Samuelson, Hicks and the like.

Fortunately there is a solution to the problem. Keynes’s books are still in print. Read them!!

Alternatives to the neoliberal mumbo jumbo of Greg Mankiw and Richard Epstein

22 Jul, 2012 at 17:55 | Posted in Economics, Politics & Society | 1 Comment

As a young research fellow in the U.S. at the beginning of the 1980s, yours truly had the great pleasure and privilege of participating in seminars and lectures with people like Paul Davidson, Hyman Minsky and Stephen Marglin.

They were great inspirations at the time. They still are.
 

The unknown knowns of modern macroeconomics

22 Jul, 2012 at 16:20 | Posted in Economics | 5 Comments

The financial crisis of 2007-08 took most laymen and economists by surprise. What was it that went wrong with our macroeconomic models, since they obviously did not foresee the collapse or even make it conceivable?

The root of our problem ultimately goes back to how we look upon the data we are handling. In modern neoclassical macroeconomics – Dynamic Stochastic General Equilibrium (DSGE), New Synthesis, New Classical and “New Keynesian” – variables are treated as if drawn from a known “data-generating process” that unfolds over time and on which we therefore have access to heaps of historical time-series. If we do not assume that we know the “data-generating process” – if we do not have the “true” model – the whole edifice collapses.

Modern macroeconomics obviously did not anticipate the enormity of the problems that unregulated “efficient” financial markets created. Why? Because it builds on the myth of us knowing the “data-generating process” and that we can describe the variables of our evolving economies as drawn from an urn containing stochastic probability functions with known means and variances.

This is like saying that you are going on a holiday trip and that you know that the chance of the weather being sunny is at least 30%, and that this is enough for you to decide whether or not to bring along your sunglasses. You are supposed to be able to calculate the expected utility based on the given probability of sunny weather and make a simple either-or decision. Uncertainty is reduced to risk.

But this is not always possible. Often we simply do not know. According to one model the chance of sunny weather is perhaps somewhere around 10% and according to another – equally good – model the chance is perhaps somewhere around 40%. We cannot put exact numbers on these assessments. We cannot calculate means and variances. There are no given probability distributions that we can appeal to.
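
To make the contrast concrete, here is a minimal sketch with invented payoff numbers (none of this comes from any actual model): under a single known probability, expected utility singles out one choice; under two equally good models of the weather, the ranking of the choices flips, and the calculation no longer settles anything.

# A minimal sketch (payoff numbers are pure inventions) of decision under
# known risk versus decision under model uncertainty.
U = {"bring sunglasses": {"sun": 10, "rain": 2},   # pays off in sun, nuisance in rain
     "leave them home":  {"sun": 0,  "rain": 5}}

def expected_utility(action, p_sun):
    return p_sun * U[action]["sun"] + (1 - p_sun) * U[action]["rain"]

def best_action(p_sun):
    return max(U, key=lambda a: expected_utility(a, p_sun))

# Risk: one known probability, one unambiguously "rational" choice.
print(best_action(0.30))   # -> 'bring sunglasses'

# Uncertainty: two equally good models disagree about the probability,
# and with them the choice flips. Expected utility alone decides nothing.
print(best_action(0.10))   # -> 'leave them home'
print(best_action(0.40))   # -> 'bring sunglasses'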

In the end this is what it all boils down to. We all know that many activities, relations, processes and events are of the Keynesian uncertainty type. The data do not unequivocally single out one decision as the only “rational” one. Neither the economist, nor the deciding individual, can fully pre-specify how people will decide when facing uncertainties and ambiguities that are ontological facts of the way the world works.

Some macroeconomists, however, still want to be able to use their hammer. So they decide to pretend that the world looks like a nail and that uncertainty can be reduced to risk, and they construct their mathematical models on that assumption. The result: financial crises and economic havoc.

How much better it would be – and how much greater the chance that we would not lull ourselves into the comforting thought that we know everything, that everything is measurable and that we have everything under control – if we would instead simply admit that we often do not know, and that we have to live with that uncertainty as well as we can.

Fooling people into believing that one can cope with an unknown economic future in a way similar to playing at the roulette wheel is a sure recipe for only one thing – economic catastrophe!

The unknown knowns – the things we fool ourselves into believing we know – often have more dangerous repercussions than the “Black Swans” of Knightian unknown unknowns, something that quantitative risk management based on the hypothesis of market efficiency and rational expectations gave ample evidence of during the latest financial crisis.

Please say after me – Sonnenschein-Mantel-Debreu

21 Jul, 2012 at 10:53 | Posted in Economics | 23 Comments

Can you say Sonnenschein-Mantel-Debreu?

Good!

Because that probably also means that you can understand why New Classical, Real Business Cycles, Dynamic Stochastic General Equilibrium (DSGE) and “New Keynesian” microfounded macromodels are such bad substitutes for real macroeconomic analysis.

These models try to describe and analyze complex and heterogeneous real economies with a single rational-expectations-robot-imitation-representative-agent. That is, with something that has absolutely nothing to do with reality. And – worse still – something that is not even amenable to the kind of general equilibrium analysis it is thought to give a foundation for, since Hugo Sonnenschein (1972), Rolf Mantel (1976) and Gerard Debreu (1974) unequivocally showed that there does not exist any condition by which assumptions on individuals would guarantee either stability or uniqueness of the equilibrium solution.

Opting for cloned representative agents that are all identical is of course not a real solution to the fallacy of composition that the Sonnenschein-Mantel-Debreu theorem points to. After all – as Nobel laureate Robert Solow noted in “The State of Macroeconomics” (Journal of Economic Perspectives 2008:243-249) – “a modern economy is populated by consumers, workers, pensioners, owners, managers, investors, entrepreneurs, bankers, and others, with different and sometimes conflicting desires, information, expectations, capacities, beliefs, and rules of behavior.” So, representative agent models are rather an evasion whereby issues of distribution, coordination, heterogeneity – everything that really defines macroeconomics – are swept under the rug.

Conclusion – don’t believe a single thing of what these microfounders tell you until they have told you how they have coped with – not evaded – Sonnenschein-Mantel-Debreu!

Of course, most macroeconomists know that using a representative agent is a flagrantly illegitimate way of ignoring real aggregation issues. They keep on with their business nevertheless, just because it significantly simplifies what they are doing. It reminds one – not a little – of the drunkard who has lost his keys in some dark place and deliberately chooses to look for them under a neighbouring street light just because it is easier to see there!

Fooled by randomness

20 Jul, 2012 at 18:02 | Posted in Statistics & Econometrics | Comments Off on Fooled by randomness

A non-trivial part of teaching statistics to social science students consists of teaching them to perform significance testing. A problem I have noticed repeatedly over the years, however, is that no matter how careful you try to be in explicating what the probabilities generated by these statistical tests – p values – really are, most students still misinterpret them.

A couple of years ago I gave a statistics course for the Swedish National Research School in History, and at the exam I asked the students to explain how one should correctly interpret p-values. Although the correct definition is p(data|null hypothesis), a majority of the students either misinterpreted the p value as the likelihood of a sampling error (which of course is wrong, since the very computation of the p value is based on the assumption that sampling errors are what cause the sample statistics not to coincide with the null hypothesis) or interpreted it as the probability of the null hypothesis being true, given the data (which of course is also wrong, since that is p(null hypothesis|data) rather than the correct p(data|null hypothesis)).
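
How far apart those two conditional probabilities can be is easy to see with a small numerical sketch (the prior and the likelihood under the alternative are made-up numbers, chosen only for illustration):

# A made-up numerical illustration that p(data|null) is not p(null|data).
p_H0 = 0.9                # assumed prior probability that the null is true
p_data_given_H0 = 0.05    # a "significant" result: data this extreme under the null
p_data_given_H1 = 0.50    # assumed probability of such data under the alternative

p_data = p_data_given_H0 * p_H0 + p_data_given_H1 * (1 - p_H0)
p_H0_given_data = p_data_given_H0 * p_H0 / p_data

print(round(p_H0_given_data, 2))
# ~0.47: even with a p value of 0.05, the null can remain roughly a coin toss.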

This is not to be blamed on the students’ ignorance, but rather on significance testing not being particularly transparent (conditional probability inference is difficult even for those of us who teach and practice it). A lot of researchers fall prey to the same mistakes. So – given that it is anyway very unlikely that any population parameter is exactly zero, and that, contrary to assumption, most samples in social science and economics are not random or do not have the right distributional shape – why continue to press students and researchers to do null hypothesis significance testing, testing that relies on a weird backward logic that students and researchers usually don’t understand?

Let me just give a simple example to illustrate how slippery it is to deal with p-values – and how easy it is to impute causality to things that really are nothing but chance occurrences.

Say you have collected cross-country data on austerity policies and growth (and let’s assume that you have been able to “control” for possible confounders). You find that countries that have implemented austerity policies have on average increased their growth by, say, 2% more than the other countries. To really feel sure about the efficacy of the austerity policies you run a significance test – thereby actually assuming without argument that all the values you have come from the same probability distribution – and you get a p-value of less than 0.05. Eureka! You’ve got a statistically significant value. The probability that you would have got this value out of pure stochastic randomness – if austerity actually had no effect – is less than 1/20.

But wait a minute. There is – as you may have guessed – a snag. If you test austerity policies in enough countries you will get a statistically significant result out of pure chance 5% of the time. So, really, there is nothing to get so excited about!
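
The arithmetic is easy to check with a minimal simulation sketch (purely invented data, no real countries involved): even when the true effect of austerity is exactly zero, roughly one test in twenty comes out “significant”.

# A minimal simulation sketch (invented data) of the multiple-testing point:
# with no true effect at all, about 5% of the tests are "significant" anyway.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_countries, n_obs = 1000, 30

false_positives = 0
for _ in range(n_countries):
    # growth differences drawn from a world where austerity has zero effect
    growth_diff = rng.normal(loc=0.0, scale=2.0, size=n_obs)
    _, p_value = stats.ttest_1samp(growth_diff, popmean=0.0)
    if p_value < 0.05:
        false_positives += 1

print(false_positives / n_countries)   # hovers around 0.05 – pure chance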

Statistical significance doesn’t say that something is important or true. And since there already are far better and more relevant tests that can be done (see e.g. here and here), it is high time to give up on this statistical fetish and stop being fooled by randomness.

Macroeconomic calibration – or why it is difficult for economists to take their own subject seriously (wonkish)

20 Jul, 2012 at 15:27 | Posted in Economics, Theory of Science & Methodology | 7 Comments

There are many kinds of useless economics held in high regard within the mainstream economics establishment today. Few – if any – deserve that regard less than the macroeconomic theory – mostly connected with Nobel laureates Finn Kydland, Robert Lucas, Edward Prescott and Thomas Sargent – called calibration.

In an interview conducted by Seppo Honkapohja and George Evans (Macroeconomic Dynamics 2005, vol. 9), Thomas Sargent says:

Evans and Honkapohja: What were the profession’s most important responses to the Lucas Critique?

Sargent: There were two. The first and most optimistic response was complete rational expectations econometrics. A rational expectations equilibrium is a likelihood function. Maximize it.

Evans and Honkapohja: Why optimistic?

Sargent: You have to believe in your model to use the likelihood function. It provides a coherent way to estimate objects of interest (preferences, technologies, information sets, measurement processes) within the context of a trusted model.

Evans and Honkapohja: What was the second response?

Sargent: Various types of calibration. Calibration is less optimistic about what your theory can accomplish because you would only use it if you didn’t fully trust your entire model, meaning that you think your model is partly misspecified or incompletely specified, or if you trusted someone else’s model and data set more than your own. My recollection is that Bob Lucas and Ed Prescott were initially very enthusiastic about rational expectations econometrics. After all, it simply involved imposing on ourselves the same high standards we had criticized the Keynesians for failing to live up to. But after about five years of doing likelihood ratio tests on rational expectations models, I recall Bob Lucas and Ed Prescott both telling me that those tests were rejecting too many good models. The idea of calibration is to ignore some of the probabilistic implications of your model but to retain others. Somehow, calibration was intended as a balanced response to professing that your model, although not correct, is still worthy as a vehicle for quantitative policy analysis….

Evans and Honkapohja: Do you think calibration in macroeconomics was an advance?

Sargent: In many ways, yes. I view it as a constructive response to Bob’s remark that “your likelihood ratio tests are rejecting too many good models”. In those days… there was a danger that skeptics and opponents would misread those likelihood ratio tests as rejections of an entire class of models, which of course they were not…. The unstated case for calibration was that it was a way to continue the process of acquiring experience in matching rational expectations models to data by lowering our standards relative to maximum likelihood, and emphasizing those features of the data that our models could capture. Instead of trumpeting their failures in terms of dismal likelihood ratio statistics, celebrate the features that they could capture and focus attention on the next unexplained feature that ought to be explained. One can argue that this was a sensible response… a sequential plan of attack: let’s first devote resources to learning how to create a range of compelling equilibrium models to incorporate interesting mechanisms. We’ll be careful about the estimation in later years when we have mastered the modelling technology…
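
What Sargent describes – keeping a few selected implications of the model while ignoring the rest of its probabilistic content – can be illustrated with a deliberately silly toy sketch (a one-parameter model of my own invention, not any actual DSGE exercise): the calibrated version matches the targeted moment and looks fine, while a full likelihood ratio test rejects the very same model.

# Toy sketch (invented model and data): calibration versus the full likelihood.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=2.0, size=200)      # "reality": variance is 4

# The toy model insists the data are Normal(mu, 1): one free parameter (mu)
# plus one hard-wired probabilistic implication (unit variance).

# Calibration: choose mu to match the sample mean and ignore the variance.
mu_calibrated = data.mean()
print("calibrated mu:", round(mu_calibrated, 2))      # matches the targeted moment

# Full-information check: a likelihood ratio test of the unit-variance
# restriction rejects the model outright – the fate that, in Sargent's telling,
# befell "too many good models".
loglik_restricted   = stats.norm(mu_calibrated, 1.0).logpdf(data).sum()
loglik_unrestricted = stats.norm(data.mean(), data.std()).logpdf(data).sum()
lr_statistic = 2 * (loglik_unrestricted - loglik_restricted)
print("LR p-value:", stats.chi2(df=1).sf(lr_statistic))   # essentially zero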

But is the Lucas-Kydland-Prescott-Sargent calibration really an advance?

Let’s see what two eminent econometricians have to say. In the Journal of Economic Perspectives (1996, vol. 10) Lars Peter Hansen and James J. Heckman write:

It is only under very special circumstances that a micro parameter such as the intertemporal elasticity of substitution or even a marginal propensity to consume out of income can be ‘plugged into’ a representative consumer model to produce an empirically concordant aggregate model … What credibility should we attach to numbers produced from their ‘computational experiments’, and why should we use their ‘calibrated models’ as a basis for serious quantitative policy evaluation? … There is no filing cabinet full of robust micro estimates ready to use in calibrating dynamic stochastic equilibrium models … The justification for what is called ‘calibration’ is vague and confusing.

This is the view of econometric methodologist Kevin Hoover:

The calibration methodology, to date, lacks any discipline as stern as that imposed by econometric methods.

And this is the verdict of Nobel laureate Paul Krugman:

The point is that if you have a conceptual model of some aspect of the world, which you know is at best an approximation, it’s OK to see what that model would say if you tried to make it numerically realistic in some dimensions.

But doing this gives you very little help in deciding whether you are more or less on the right analytical track. I was going to say no help, but it is true that a calibration exercise is informative when it fails: if there’s no way to squeeze the relevant data into your model, or the calibrated model makes predictions that you know on other grounds are ludicrous, something was gained. But no way is calibration a substitute for actual econometrics that tests your view about how the world works.

In physics it may possibly not be straining credulity too much to model processes as ergodic – where time and history do not really matter – but in social and historical sciences it is obviously ridiculous. If societies and economies were ergodic worlds, why do econometricians fervently discuss things such as structural breaks and regime shifts? That they do is an indication of the unrealisticness of treating open systems as analyzable with ergodic concepts.

The future is not reducible to a known set of prospects. It is not like sitting at the roulette table and calculating what the future outcomes of spinning the wheel will be. Reading Sargent and other calibrationists one comes to think of Robert Clower’s apt remark that

much economics is so far removed from anything that remotely resembles the real world that it’s often difficult for economists to take their own subject seriously.

Instead of assuming calibration and rational expectations to be right, one ought to confront the hypothesis with the available evidence. It is not enough to construct models. Anyone can construct models. To be seriously interesting, models have to come with an aim. They have to have an intended use. If the intention of calibration and rational expectations is to help us explain real economies, it has to be evaluated from that perspective. A model or hypothesis without a specific applicability does not really deserve our interest.

To say, as Edward Prescott does, that

one can only test if some theory, whether it incorporates rational expectations or, for that matter, irrational expectations, is or is not consistent with observations

is not enough. Without strong evidence all kinds of absurd claims and nonsense may pretend to be science. We have to demand more of a justification than this rather watered-down version of “anything goes” when it comes to rationality postulates. If one proposes rational expectations one also has to support its underlying assumptions. None is given, which makes it rather puzzling how rational expectations has become the standard modeling assumption in much of modern macroeconomics. Perhaps the reason is, as Paul Krugman has it, that economists often mistake

beauty, clad in impressive looking mathematics, for truth.

But I think Prescott’s view is also the reason why calibration economists are not particularly interested in empirical examinations of how real choices and decisions are made in real economies. In the hands of Lucas, Prescott and Sargent, rational expectations has been transformed from an – in principle – testable hypothesis into an irrefutable proposition. Irrefutable propositions may be comfortable – like religious convictions or ideological dogmas – but they are not science.

On useless economics and academic incentives (wonkish)

20 Jul, 2012 at 10:24 | Posted in Economics, Theory of Science & Methodology | 7 Comments

Paul Krugman has a great post up on his blog today, where he gets – almost – everything right on the state of mainstream economics academia today:

To put it bluntly: faced with a severe economic crisis — the very kind of crisis macroeconomics was created to deal with — it often seems as if the profession is determined to make itself useless.

Wren-Lewis’s first post concerns the obsession with microfoundations. As he says, this obsession is at this point deeply embedded in the academic incentive structure:

“If you think that only ‘modelling what you can microfound’ is so obviously wrong that it cannot possibly be defended, you obviously have never had a referee’s report which rejected your paper because one of your modelling choices had ‘no clear microfoundations’. One of the most depressing conversations I have is with bright young macroeconomists who say they would love to explore some interesting real world phenomenon, but will not do so because its microfoundations are unclear.”

So where does this come from? The “Lucas critique” has been a big deal in the profession for more than a generation. This says that even if you observe a particular relationship in the real world — say, a relationship between inflation and unemployment — this relationship may change when policy changes. So you really want to have a deeper understanding of where the relationship comes from — “microfoundations” — so that you won’t be caught off guard if it does change in response to policy.

And this is fair enough. But what if you have an observed fact about the world — say, downward wage rigidity — that you can’t easily derive from first principles, but seems to be robust in practice? You might think that the right response is to operate on the provisional assumption that this relationship will continue to hold, rather than simply assume it away because it isn’t properly microfounded — and you’d be right, in my view. But the profession, at least in its academic wing, has largely chosen to take the opposite tack, insisting that if it isn’t microfounded — and with all i’s dotted and t’s crossed, no less — then it’s not publishable or, in the end, thinkable.

Now we’re having a crisis that makes perfect sense if you’re willing to accept some real-world behavior that doesn’t arise from intertemporal maximization, but none at all if you aren’t — and to a large extent the academic macroeconomics profession has absented itself from useful discussion.

In the second post Wren-Lewis responds to another tired attack on fiscal stimulus, based on basically nothing. As he says, it’s hard to imagine a clearer case for action than what we’re seeing: overwhelming evidence that fiscal policy does in fact work, zero real interest rates. Yet a substantial number of economists seem determined to find reasons not to act. Some of this is ideology, but I suspect that part of this also represents a carryover from academic careerism, where differentiating your product — claiming that the big guys are wrong at something — is part of what you do to get noticed. This kind of petty stuff doesn’t matter when it’s just academic games, but when it clouds the discussion in the face of mass unemployment, it becomes very bad indeed.

My bottom line is that we as a profession faced the crucial test of our lives — and by and large we failed and continue to fail. It’s not a happy story.

Although I think this is great – and brave, since what Krugman (and Wren-Lewis) is admitting is something we all know is a fact, but few are willing to air publicly – I would like to make two comments.

First: The microfoundational program that sprang out of the “Lucas critique” doesn’t deliver a “deeper understanding” of stable, fundamental relationships in the economy, and so is, at least to me, not possible to characterize as “fair enough.”

Let me elaborate a little.

Neoclassical economic theory today is in the story-telling business whereby economic theorists create make-believe analogue models of the target system – usually conceived as the real economic system. This modeling activity is considered useful and essential. To understand and explain relations between different entities in the real economy the predominant strategy is to build models and make things happen in these “analogue-economy models”.

But how do we bridge the gulf between model and reality? According to Lucas we have to be willing to argue by “analogy from what we know” to what we would like to know. Progress lies in the pursuit of the ambition to “tell better and better stories.”

If the goal of theory is to be able to make accurate forecasts, the ability of a model to imitate actual behavior does not give much leverage. What is required, according to the “Lucas critique”, is some kind of invariance of the model’s structure under policy variations. Parametric invariance in an economic model cannot be taken for granted, but to Lucas it seems reasonable to hope that neither tastes nor technology “vary systematically.”

The model should enable us to pose counterfactual questions about what would happen if some variable were to change in a specific way. Hence the assumption of structural invariance, which purportedly enables the theoretical economist to do just that. But does it? Lucas appeals to “reasonable hope”, a rather weak justification for a modeler to apply such a far-reaching assumption. To warrant it one would expect an argument that this assumption – whether we conceive of it as part of a strategy of isolation, idealization or successive approximation – really establishes a useful relation that we can export or bridge to the target system, the actual economy.

The basic assumption of this “precise and rigorous” model therefore cannot be considered anything else than an unsubstantiated conjecture as long as it is not supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence has been presented. This is all the more tantalizing since Lucas himself stresses that the presumption must be defended on empirical grounds.

And applying a “Lucas critique” to Lucas’s own model, it is obvious that it too fails. For example, changing “policy rules” cannot just be presumed not to influence investment and consumption behavior and a fortiori technology, thereby contradicting the invariance assumption. Technology and tastes cannot live up to the status of an economy’s deep and structurally stable Holy Grail. They too are part and parcel of an ever-changing and open economy. Lucas’s hope of being able to model the economy as “a FORTRAN program” therefore seems – from an ontological point of view – totally misdirected. The failure of the attempt to anchor the analysis in the allegedly stable deep parameters “tastes” and “technology” shows that if you neglect ontological considerations pertaining to the target system, reality ultimately kicks back when, at last, questions of bridging and exportation of model exercises are laid on the table. No matter how precise and rigorous the analysis is, and no matter how hard one tries to cast the argument in modern mathematical form, it does not push science forward one millimeter if it does not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not per se say anything about external validity.

Formalistic deductive “Glasperlenspiel” can be very impressive and seductive. But in the realm of science it ought to be considered of little or no value to simply make claims about the model and lose sight of reality.

Second: It is rather typical – sad to say, but true – that Krugman once again deliberately fails to mention that heterodox economists – many of whom, like yours truly, are Post Keynesians – haven’t succumbed to the microfoundational plague, and that they have been enormously successful in both predicting and explaining the financial crisis that haunts us today. And, from my own experience, I can assure you that we have almost insurmountable problems getting things published in the major economics journals – journals run by the hegemonic mainstream neoclassical establishment.

Listen to Larry, Greg!

19 Jul, 2012 at 19:04 | Posted in Economics, Politics & Society | Comments Off on Listen to Larry, Greg!


Lawrence Summers listening to Greg Mankiw’s explications on inequality?

Even though the interest may not be reciprocated,  it would obviously be a good idea for Greg Mankiw to listen to his Harvard colleague Lawrence Summers, instead of trivializing the problems created by increasing inequality! Summers has some interesting  thoughts on why income inequality is on the rise and what to do about it:

Why has the top 1 per cent of the population done so well relative to the rest? The answer probably lies substantially in changing technology and globalisation. When George Eastman revolutionised photography, he did very well and, because he needed a large number of Americans to carry out his vision, the city of Rochester had a thriving middle class for two generations. By contrast, when Steve Jobs revolutionised personal computing, he and the shareholders in Apple (who are spread all over the world) did very well but a much smaller benefit flowed to middle-class American workers both because production was outsourced and because the production of computers and software was not terribly labour intensive …

What then is the right response to rising inequality? There are too few good ideas in current political discourse and the development of better ones is crucial. Here are three.

First, government must be careful that it does not facilitate increases in inequality by rewarding the wealthy with special concessions. Where governments dispose of assets or allocate licences, there is a compelling case for more use of auctions to which all have access. Where government provides insurance implicitly or explicitly, premiums must be set as much as possible on a market basis rather than in consultation with the affected industry. A general posture for government of standing up for capitalism rather than particular well-connected capitalists would also serve to mitigate inequality.

Second, there is scope for pro-fairness, pro-growth tax reform. When there are more and more great fortunes being created and the government is in larger and larger deficit, it is hardly a time for the estate tax to be eviscerated. With smaller families and ever more bifurcation in the investment opportunities open to those with wealth, there is a real risk that the old notion of “shirtsleeves to shirtsleeves in three generations” will become obsolete, and those with wealth will endow dynasties.

Third, the public sector must insure that there is greater equity in areas of the most fundamental importance. It will always be the case in a market economy that some will have mansions, art and the ability to travel in lavish fashion. What is more troubling is that the ability of the children of middle-class families to attend college has been seriously compromised by increasing tuition fees and sharp cutbacks at public universities and colleges.

At the same time, in many parts of the country a gap has opened between the quality of the private school education offered to the children of the rich and the public school educations enjoyed by everyone else. Most alarming is the near doubling over the last generation in the gap between the life expectancy of the affluent and the ordinary.

Neither the politics of polarisation nor those of noblesse oblige will serve to protect the interests of the middle class in the post-industrial economy. We will have to find ways to do better.

Greg Mankiw and Richard Epstein – libertarian mumbo jumbo

19 Jul, 2012 at 15:51 | Posted in Economics, Politics & Society | 7 Comments

As yours truly has commented on earlier, walked-out Harvard economist and George Bush advisor Greg Mankiw is having problems explaining the rising inequality we have seen for the last 30 years in the US and elsewhere in Western societies. Mankiw writes:

Even if the income gains are in the top 1 percent, why does that imply that the right story is not about education?

I then realized that Paul is making an implicit assumption–that the return to education is deterministic. If indeed a year of schooling guaranteed you precisely a 10 percent increase in earnings, then there is no way increasing education by a few years could move you from the middle class to the top 1 percent.

But it may be better to think of the return to education as stochastic. Education not only increases the average income a person will earn, but it also changes the entire distribution of possible life outcomes. It does not guarantee that a person will end up in the top 1 percent, but it increases the likelihood. I have not seen any data on this, but I am willing to bet that the top 1 percent are more educated than the average American; while their education did not ensure their economic success, it played a role.

This is, of course, really nothing but one big evasive action, trying to explain away a very disturbing structural shift that has taken place in our societies – a shift that has very little to do with stochastic returns to education. Those were in place 30 or 40 years ago too. At that time they meant that a CEO earned perhaps 10-12 times what “ordinary” people earn. Today they mean that CEOs perhaps earn 100-200 times what “ordinary” people earn. A question of education? No way! It is a question of greed and a lost sense of a common project of building a sustainable society. A result of stochastic returns to education? No, this has to do with income and wealth increasingly being concentrated in the hands of a very small and privileged elite.

Mankiw has stubbornly refused to budge on his libertarian stance on this issue. So, rather consistently, he links on his blog to a PBS interview with libertarian professor of law Richard Epstein:

RICHARD EPSTEIN: What’s good about inequality is if, in fact, it turns out that inequality creates an incentive for people to produce and to create wealth, it’s a wonderful force for innovation.

PAUL SOLMAN: Aren’t many of the top 1 percent or 0.1 percent in this country rich because they’re in finance?

RICHARD EPSTEIN: Yes. Many of the very richest people in the United States are rich because they are in finance.

And one of the things you have to ask is, why is anyone prepared to pay them huge sums of money if in fact they perform nothing of social value? And the answer is that when you try to knock out the financiers, what you do is you destroy the liquidity of capital markets. And when you destroy the liquidity of those markets, you make it impossible for businesses to invest, you make it impossible for people to buy home mortgages and so forth, and all sorts of other breakdowns.

So they should be rich. It doesn’t bother me.

PAUL SOLMAN: Are you worried that a small number of people controlling a disproportionate share of the wealth can control a democratic system?

RICHARD EPSTEIN: Oh, my God no.

Mankiw does not in any way comment on Epstein’s amazing stupidities or give us a hint of why he has chosen to link to the interview. But sometimes silence perhaps says more than a thousand words …

Now, compare that mumbo jumbo with what a true liberal has to say on the issue:

The outstanding faults of the economic society in which we live are its failure to provide for full employment and its arbitrary and inequitable distribution of wealth and incomes … I believe that there is social and psychological justification for significant inequalities of income and wealth, but not for such large disparities as exist to-day.

John Maynard Keynes wrote this in General Theory (1936). Seventy-five years later it looks like this in the UK, the US and Sweden:

[Chart showing income inequality in the UK, the US and Sweden. Source: The Top Incomes Database and own calculations]

Nobel laureate Joseph Stiglitz has some very interesting thoughts – in Vanity Fair – on what this increasing economic inequality does to our societies:

Some people look at income inequality and shrug their shoulders. So what if this person gains and that person loses? What matters, they argue, is not how the pie is divided but the size of the pie. That argument is fundamentally wrong. An economy in which most citizens are doing worse year after year—an economy like America’s—is not likely to do well over the long haul … Perhaps most important, a modern economy requires “collective action”—it needs government to invest in infrastructure, education, and technology … The more divided a society becomes in terms of wealth, the more reluctant the wealthy become to spend money on common needs … America’s inequality distorts our society in every conceivable way. There is, for one thing, a well-documented lifestyle effect—people outside the top 1 percent increasingly live beyond their means. Trickle-down economics may be a chimera, but trickle-down behaviorism is very real … Of all the costs imposed on our society by the top 1 percent, perhaps the greatest is this: the erosion of our sense of identity, in which fair play, equality of opportunity, and a sense of community are so important.

A society where we allow the inequality of incomes and wealth to increase without bounds sooner or later implodes. The cement that keeps us together erodes and in the end we are only left with people dipped in the ice-cold water of egoism and greed. It’s high time to put an end to this, the worst Juggernaut of our time!

Models and successive approximations in economics

19 Jul, 2012 at 13:01 | Posted in Economics, Theory of Science & Methodology | 9 Comments

The ongoing debate on “modern” microfounded macroeconomics raises some very interesting philosophical-methodological issues.

Macroeconomist Simon Wren-Lewis writes on his blog (emphasis added):

Stuff like we cannot possibly take microfounded macro seriously, because it is based on an all-embracing representative actor equipped with superhuman knowledge and forecasting abilities. To which I feel like shouting – where else do you start? I always say to PhD students, start simple, understand the simple model, and then complicate. So we start with a representative agent. What else could we do?

And in another post:

As an intellectual exercise, the ‘model what you can microfound’ approach can be informative. Hopefully it is also a stepping stone on the way to being able to explain what you see.
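It may help to spell out what the representative-agent building block Wren-Lewis invokes actually looks like. In the canonical textbook formulation (my illustration, not Wren-Lewis’s own notation), an infinitely lived representative household chooses consumption to maximize expected discounted utility, which yields the familiar consumption Euler equation

\[
u'(c_t) = \beta\, \mathbb{E}_t\!\left[(1 + r_{t+1})\, u'(c_{t+1})\right],
\]

where \(\beta\) is the household’s discount factor and \(r_{t+1}\) the real interest rate. “Microfounding” a macro model means deriving its aggregate relations from optimization conditions of this kind, imposed on a single agent who stands in for the entire household sector.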

What these quotations illustrate well is the idea of science advancing through the use of successive approximations. Is this really a feasible methodology? Let me elaborate a little on why I think it is not.

Most models in science are representations of something else. Models “stand for” or “depict” specific parts of a “target system” (usually the real world). All theories and models have to use sign vehicles to convey some kind of content that can be used to say something about the target system. But purpose-built assumptions – like “rational expectations” or “representative actors” – made solely to secure a way of reaching deductively validated results in mathematical models, are of little value if they cannot be validated outside the model.

All empirical sciences use simplifying or unrealistic assumptions in their modeling activities. That is not the issue – as long as the assumptions made are not unrealistic in the wrong way or for the wrong reasons.

Theories are difficult to directly confront with reality. Economists therefore build models of their theories. Those models are representations that are directly examined and manipulated to indirectly say something about the target systems.

But models do not only face theory. They also have to look to the world. Being able to model a “credible world,” a world that somehow could be considered real or similar to the real world, is not the same as investigating the real world. Even though all theories are false, since they simplify, they may still serve our pursuit of truth. But then they cannot be unrealistic or false in just any way. The falsehood or unrealism has to be qualified.

One could of course also ask for robustness [in a response to a piece on his blog, Wren-Lewis writes: “The paradox of thrift was not based on a microfounded model. But the fact that you can also get it from a microfounded model makes me much more confident that its a robust result!”], but the “credible world,” even after having been tested for robustness, can still be a long way from reality – and unfortunately often in ways we know are important. Robustness of claims in a model does not per se give a warrant for exporting those claims to real-world target systems.
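For concreteness, the paradox of thrift Wren-Lewis mentions can be stated in the simplest non-microfounded setting – the textbook Keynesian cross (my illustration, not his):

\[
Y = C + I, \qquad C = c_0 + c_1 Y, \qquad 0 < c_1 < 1,
\]

so that equilibrium income is \(Y^{*} = (c_0 + I)/(1 - c_1)\) and equilibrium saving is \(S^{*} = Y^{*} - C^{*} = I\). If households try to save more (a fall in autonomous consumption \(c_0\)), income falls but aggregate saving remains pinned down by the given level of investment \(I\): the attempt to save more leaves total saving unchanged. That the same result can be reproduced in a microfounded model is what Wren-Lewis takes as evidence of robustness.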

Anyway, robust theorems are exceedingly rare or non-existent in macroeconomics. Explanation, understanding and prediction of real-world phenomena, relations and mechanisms therefore cannot be grounded (solely) in robustness analysis. Some of the standard assumptions made in neoclassical economic theory – on rationality, information handling and types of uncertainty – cannot be made more realistic by de-idealization or successive approximations without altering the theory and its models fundamentally.

If we cannot show that the mechanisms or causes we isolate and handle in our models are stable – in the sense that they do not change from one situation to another when we export them from our models to our target systems – then they only hold under ceteris paribus conditions and are a fortiori of limited value for our understanding, explanation and prediction of our real-world target systems.

The obvious ontological shortcoming of a basically epistemic – rather than ontological – approach such as “successive approximations” is that “similarity” or “resemblance” tout court do not guarantee that the correspondence between model and target is interesting, relevant, revealing or somehow adequate in terms of mechanisms, causal powers, capacities or tendencies. No matter how many convoluted refinements of concepts are made in the model, if the “successive approximations” do not result in models that are similar to reality in the appropriate respects (such as structure, isomorphism, etc.), the surrogate system becomes a substitute system that does not bridge to the world but rather misses its target.

So I have to conclude that constructing “minimal macroeconomic models”, or using microfounded macroeconomic models as “stylized facts” or “stylized pictures” that somehow “successively approximate” macroeconomic reality, is a rather unimpressive attempt at legitimizing the use of fictitious idealizations for reasons that have more to do with model tractability than with a genuine interest in understanding and explaining features of real economies. Many of the model assumptions standardly made in neoclassical macroeconomics are restrictive rather than harmless and a fortiori cannot in any sensible meaning be considered approximations at all.

On rational expectations and the communism of macroeconomic models

19 Jul, 2012 at 10:59 | Posted in Economics, Theory of Science & Methodology | Comments Off on On rational expectations and the communism of macroeconomic models

Professor John Kay has a marvelous article in the Financial Times on why “modern” macroeconomics – based on “rational expectations” and “representative actors” – fails. Kay writes:

Prof Sargent and colleagues appropriated the term “rational expectations” for their answer. Suppose the economic world evolves according to some predetermined model, in which uncertainties are “known unknowns” that can be described by probability distributions. Then economists could gradually deduce the properties of this model, and businesses and individuals would naturally form expectations in that light. If they did not, they would be missing obvious opportunities for advantage.

This approach, which postulates a universal explanation into which economists have privileged insight, was as influential as it was superficially attractive. But a scientific idea is not seminal because it influences the research agenda of PhD students. An important scientific advance yields conclusions that differ from those derived from other theories, and establishes that these divergent conclusions are supported by observation. Yet as Prof Sargent disarmingly observed, “such empirical tests were rejecting too many good models” in the programme he had established with fellow Nobel laureates Bob Lucas and Ed Prescott. In their world, the validity of a theory is demonstrated if, after the event, and often with torturing of data and ad hoc adjustments that are usually called “imperfections”, it can be reconciled with already known facts – “calibrated”. Since almost everything can be “explained” in this way, the theory is indeed universal; no other approach is necessary, or even admissible. Asked “do you think that differences among people’s models are important aspects of macroeconomic policy debates”, Prof Sargent replied: “The fact is you simply cannot talk about their differences within the typical rational expectations model. There is a communism of models. All agents within the model, the econometricians, and God share the same model.”

Rational expectations consequently fail for the same reason communism failed – the arrogance and ignorance of the monopolist. In their critique of rational expectations, Roman Frydman and Michael Goldberg employ Hayek’s critique of planning; the market economy, unlike communism, can mediate different perceptions of the world, bringing together knowledge whose totality is not held by anyone. God did not vouchsafe his model to us, mortals see the present imperfectly and the future dimly, and use many different models. Some agents made profits, some losses, and the financial crisis of 2007-08 decided which was which. Only Prof Sargent’s econometricians were wedded to a single model and could, as usual, explain the crisis only after it had occurred. For them, the crisis was a random shock, but the occasion for a Nobel prize.
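In formal terms, the hypothesis Kay describes is usually stated as the requirement that agents’ subjective forecasts coincide with the mathematical expectation generated by the model itself (a standard textbook rendering, not Kay’s or Sargent’s own notation):

\[
x^{e}_{t+1} = \mathbb{E}\!\left[x_{t+1} \mid \Omega_t\right],
\]

where \(\Omega_t\) is the information set available at time \(t\), so that forecast errors \(x_{t+1} - x^{e}_{t+1}\) are, by construction, unpredictable on the basis of anything the agents know. It is precisely this identification of agents’ beliefs with the modeller’s own probability model that Sargent’s “communism of models” remark captures.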

One might perhaps find it odd to juxtapose God and people, but as Leonard Rapping said (in Arjo Klamer, The New Classical Macroeconomics, 1984, p. 234):

Frankly, I do not think that the rational expectations theorists are in the real world. Their approach is much too abstract.

The first microfounded macroeconomist

18 Jul, 2012 at 21:10 | Posted in Varia | Comments Off on The first microfounded macroeconomist

Straight from the horse’s mouth on “rational expectations”

18 Jul, 2012 at 20:23 | Posted in Economics | 1 Comment

As we have seen, Oxford professor Simon Wren-Lewis undauntedly goes on defending representative-actor and rational-expectations models in the microfoundations programme for macroeconomics on his blog.

It may perhaps be interesting to listen to how Mr Rational Expectations himself – Nobel laureate Robert Lucas – assesses these assumptions in the aftermath of the latest financial crisis:

Kevin Hoover: The Great Recession and the recent financial crisis have been widely viewed in both popular and professional commentary as a challenge to rational expectations and to efficient markets … I’m asking you whether you accept any of the blame … there’s been a lot of talk about whether rational expectations and the efficient-markets hypotheses is where we should locate the analytical problems that made us blind.

Robert Lucas: You know, people had no trouble having financial meltdowns in their economies before all this stuff we’ve been talking about came on board. We didn’t help, though; there’s no question about that. We may have focused attention on the wrong things, I don’t know.

Source
