Prayer (personal)

3 February, 2016 at 13:27 | Posted in Politics & Society | Leave a comment

This one is for you — all you brothers and sisters, struggling to survive in civil wars, or forced to flee your homes, risking your lives on your way to my country — Sweden — or other countries in Europe.

May God be with you.

‘New Keynesianism’ — an unpleasant macroeconomic fairy-tale

3 February, 2016 at 12:38 | Posted in Economics | 2 Comments

The so-called new-Keynesian (or NK) model has emerged and become a workhorse for policy and welfare analysis … The model starts from the RBC model without capital, and, in its basic incarnation, adds only two imperfections. It introduces monopolistic competition in the goods market. The reason is clear: If the economy is going to have price setters, they better have some monopoly power. It then introduces discrete nominal price setting, using a formulation introduced by Calvo, which turns out to be the most analytically convenient.

The model is simple, analytically convenient, and has largely replaced the IS-LM model as the basic model of fluctuations in graduate courses (although not yet in undergraduate textbooks). Like the IS-LM model, it reduces a complex reality to a few simple equations. Unlike the IS-LM model, it is formally rather than informally derived from optimization by firms and consumers … The costs are that, while tractable, the first two equations of the model are patently false … The aggregate demand equation ignores the existence of investment, and relies on an intertemporal substitution effect in response to the interest rate, which is hard to detect in the data on consumers. The inflation equation implies a purely forward looking behavior of inflation, which again appears strongly at odds with the data …

One striking (and unpleasant) characteristic of the basic New Keynesian model is that there is no unemployment! Movements take place along a labor supply curve, either at the intensive margin (with workers varying hours) or at the extensive margin (with workers deciding whether or not to participate). One has a sense, however, that this may give a misleading description of fluctuations, in positive terms, and, even more so, in normative terms: The welfare cost of fluctuations is often thought to fall disproportionately on the unemployed.

Olivier Blanchard
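For readers who have never seen them written out, the two equations Blanchard is talking about are standardly presented in log-linearised form (the notation here is the conventional textbook one, not Blanchard's own) as

\[
x_t = \mathbb{E}_t\, x_{t+1} - \frac{1}{\sigma}\left(i_t - \mathbb{E}_t\, \pi_{t+1} - r_t^n\right) \qquad \text{(aggregate demand)}
\]

\[
\pi_t = \beta\, \mathbb{E}_t\, \pi_{t+1} + \kappa\, x_t \qquad \text{(Phillips curve)}
\]

where \(x_t\) is the output gap, \(\pi_t\) inflation, \(i_t\) the nominal interest rate and \(r_t^n\) the 'natural' real rate. One look suffices to see Blanchard's point: consumption is the only component of demand (investment is simply absent), and current inflation is driven entirely by expected future inflation.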

Wren-Lewis on the New Classical Counter Revolution

2 February, 2016 at 18:19 | Posted in Economics | 5 Comments

Simon Wren-Lewis argues in a new paper — Unravelling the New Classical Counter Revolution — that it is essential to understand the success of the New Classical Counter Revolution (NCCR) if we are going to be able to position Keynes’s General Theory today. Writes Wren-Lewis:

One other undoubted appeal that the NCCR had was that it allowed macroeconomics to be brought back under the microeconomics umbrella. New Keynesian economists can talk about how business cycles involve an externality (price rigidity reflecting the cost of adjusting individual prices also has an impact on overall demand), and this market failure requires state intervention. This is language their micro colleagues can relate to …

The theoretical insights that New Classical economists brought to the table were impressive: besides rational expectations, there was a rationalisation of permanent income and the life-cycle models using intertemporal optimisation, time inconsistency and more …

A new revolution, that replaces current methods with older ways of doing macroeconomics, seems unlikely and I would argue is also undesirable. The discipline does not need to advance one revolution at a time …

To understand modern academic macroeconomics, it is no longer essential that you start with The General Theory. It is far more important that you read Lucas and Sargent (1979), which is a central text in what is generally known as the New Classical Counter Revolution (NCCR). That gave birth to DSGE models and the microfoundations programme, which are central to mainstream macroeconomics today …

Hmm …

There’s something that just does not sit very well with this picture of modern macroeconomics.

‘Read Lucas and Sargent (1979)’. Yes, why not. One who has read it is Wren-Lewis’s ‘New Keynesian’ buddy Paul Krugman. And this is what he has to say on that reading experience:

Lucas and his school … went even further down the equilibrium rabbit hole, notably with real business cycle theory. And here is where the kind of willful obscurantism Romer is after became the norm. I wrote last year about the remarkable failure of RBC theorists ever to offer an intuitive explanation of how their models work, which I at least hinted was willful:

“But the RBC theorists never seem to go there; it’s right into calibration and statistical moments, with never a break for intuition. And because they never do the simple version, they don’t realize (or at any rate don’t admit to themselves) how fundamentally silly the whole thing sounds, how much it’s at odds with lived experience.”

Paul Krugman

Yours truly, of course, totally agrees with Paul on Lucas’s rabbit-hole freshwater school.

And so does Truman F. Bewley:

Lucas and Rapping (1969) claim that cyclical increases in unemployment occur when workers quit their jobs because wages or salaries fall below expectations …

According to this explanation, when wages are unusually low, people become unemployed in order to enjoy free time, substituting leisure for income at a time when they lose the least income …

According to the theory, quits into unemployment increase during recessions, whereas historically quits decrease sharply and roughly half of unemployed workers become jobless because they are laid off … During the recession I studied, people were even afraid to change jobs because new ones might prove unstable and lead to unemployment …

If wages and salaries hardly ever fall, the intertemporal substitution theory is widely applicable only if the unemployed prefer jobless leisure to continued employment at their old pay. However, the attitude and circumstances of the unemployed are not consistent with their having made this choice …

In real business cycle theory, unemployment is interpreted as leisure optimally selected by workers, as in the Lucas-Rapping model. It has proved difficult to construct business cycle models consistent with this assumption and with real wage fluctuations as small as they are in reality, relative to fluctuations in employment.

This is, of course, only what you would expect of New Classical Chicago economists.

So, what’s the problem?

The problem is that, sadly enough, this extraterrestrial view of unemployment is actually shared by Wren-Lewis and other so-called ‘New Keynesians’ — a school whose microfounded dynamic stochastic general equilibrium models cannot even incorporate such a basic fact of reality as involuntary unemployment!

Of course, for those working with microfounded representative-agent models, this should come as no surprise. If one representative agent is employed, all representative agents are. The kind of unemployment that occurs is voluntary, since it is only adjustments of the hours of work that these optimizing agents make to maximize their utility.

In the basic DSGE models used by most ‘New Keynesians’, the labour market always clears – responding to changes in the interest rate, expected lifetime income, or the real wage, the representative agent maximizes her utility by varying her labour supply, money holdings and consumption over time. Most importantly – if the real wage somehow deviates from its “equilibrium value,” the representative agent adjusts her labour supply, so that when the real wage is higher than its “equilibrium value,” labour supply is increased, and when the real wage is below its “equilibrium value,” labour supply is decreased.
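A minimal sketch of what this means formally (standard textbook notation, not any particular author's specification): with period utility

\[
U(C_t, N_t) = \frac{C_t^{1-\sigma}}{1-\sigma} - \frac{N_t^{1+\varphi}}{1+\varphi},
\]

the representative agent's intratemporal first-order condition is

\[
C_t^{\sigma} N_t^{\varphi} = \frac{W_t}{P_t},
\]

so hours worked \(N_t\) always sit exactly where the marginal disutility of work equals the real wage. Whatever level of employment we observe is, by construction, the agent's optimal choice.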

In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence, unemployment is totally voluntary. To be unemployed is something one optimally chooses to be.

To Wren-Lewis it seems as though the ‘New Keynesian’ acceptance of rational expectations, representative agents and microfounded DSGE models is something more or less self-evidently good. Not all economists (yours truly included) share that view:

While one can understand that some of the elements in DSGE models seem to appeal to Keynesians at first sight, after closer examination, these models are in fundamental contradiction to Post-Keynesian and even traditional Keynesian thinking. The DSGE model is a model in which output is determined in the labour market as in New Classical models and in which aggregate demand plays only a very secondary role, even in the short run.

In addition, given the fundamental philosophical problems presented for the use of DSGE models for policy simulation, namely the fact that a number of parameters used have completely implausible magnitudes and that the degree of freedom for different parameters is so large that DSGE models with fundamentally different parametrization (and therefore different policy conclusions) equally well produce time series which fit the real-world data, it is also very hard to understand why DSGE models have reached such a prominence in economic science in general.

Sebastian Dullien

Neither New Classical nor ‘New Keynesian’ microfounded DSGE macro models have helped us foresee, understand or craft solutions to the problems of today’s economies.

Wren-Lewis ultimately falls back on the same kind of models that he criticizes, and it would surely be interesting to hear him explain how silly assumptions like ‘hyperrationality’ and ‘representative agents’ help him work out the fundamentals of a truly Keynesian macroeconomic analysis.

In a recent paper on modern macroeconomics, another of Wren-Lewis’s ‘New Keynesian’ buddies, macroeconomist Greg Mankiw, wrote:

The real world of macroeconomic policymaking can be disheartening for those of us who have spent most of our careers in academia. The sad truth is that the macroeconomic research of the past three decades has had only minor impact on the practical analysis of monetary or fiscal policy. The explanation is not that economists in the policy arena are ignorant of recent developments. Quite the contrary: The staff of the Federal Reserve includes some of the best young Ph.D.’s, and the Council of Economic Advisers under both Democratic and Republican administrations draws talent from the nation’s top research universities. The fact that modern macroeconomic research is not widely used in practical policymaking is prima facie evidence that it is of little use for this purpose. The research may have been successful as a matter of science, but it has not contributed significantly to macroeconomic engineering.

So, then what is the raison d’être of macroeconomics, if it has nothing to say about the real world and the economic problems out there?

If macroeconomic models – no matter what ilk – assume representative actors, rational expectations, market clearing and equilibrium, and we know that real people and markets cannot be expected to obey these assumptions, there is obviously no warrant for supposing that conclusions or hypotheses about causally relevant mechanisms or regularities can be bridged over to the real world. Macroeconomic theorists – regardless of being ‘New Monetarist’, ‘New Classical’ or ‘New Keynesian’ – ought to do some ontological reflection and heed Keynes’s warnings on using thought-models in economics:

The object of our analysis is, not to provide a machine, or method of blind manipulation, which will furnish an infallible answer, but to provide ourselves with an organized and orderly method of thinking out particular problems; and, after we have reached a provisional conclusion by isolating the complicating factors one by one, we then have to go back on ourselves and allow, as well as we can, for the probable interactions of the factors amongst themselves. This is the nature of economic thinking. Any other way of applying our formal principles of thought (without which, however, we shall be lost in the wood) will lead us into error.


So, these are some of my arguments for why I think that Simon Wren-Lewis ought to be even more critical of the present state of macroeconomics than he is. Trying to represent real-world target systems with models flagrantly at odds with reality is futile. And whether those models are New Classical or ‘New Keynesian’ makes very little difference.

Simon Wren-Lewis writes:

“It is hard to get academic macroeconomists trained since the 1980s to address [large-scale Keynesian models], because they have been taught that these models and techniques are fatally flawed because of the Lucas critique and identification problems … But DSGE models as a guide for policy are also fatally flawed because they are too simple. The unique property that DSGE models have is internal consistency … Take a DSGE model, and alter a few equations so that they fit the data much better, and you have what could be called a structural econometric model. It is internally inconsistent, but because it fits the data better it may be a better guide for policy.”

Nope! Not too simple. Just wrong!

I disagree with Simon. NK models are not too simple. They are simply wrong. There are no ‘frictions’. There is no Calvo Fairy. There are simply persistent nominal beliefs.

Period.

Roger Farmer

Yes indeed. There really is something about the way macroeconomists construct their models nowadays that obviously doesn’t sit right.

Fortunately — when you have grown tired of the kind of macroeconomic apologetics produced by ‘New Keynesian’ macroeconomists like Wren-Lewis, Mankiw, and Krugman, there still are some real Keynesian macroeconomists to read. One of them — Axel Leijonhufvud — writes:

For many years now, the main alternative to Real Business Cycle Theory has been a somewhat loose cluster of models given the label of New Keynesian theory. New Keynesians adhere on the whole to the same DSGE modeling technology as RBC macroeconomists but differ in the extent to which they emphasise inflexibilities of prices or other contract terms as sources of short-term adjustment problems in the economy. The “New Keynesian” label refers back to the “rigid wages” brand of Keynesian theory of 40 or 50 years ago. Except for this stress on inflexibilities this brand of contemporary macroeconomic theory has basically nothing Keynesian about it …

I conclude that dynamic stochastic general equilibrium theory has shown itself an intellectually bankrupt enterprise. But this does not mean that we should revert to the old Keynesian theory that preceded it (or adopt the New Keynesian theory that has tried to compete with it). What we need to learn from Keynes … is how to view our responsibilities and how to approach our subject.

No matter how many brilliantly silly ‘New Keynesian’ DSGE models Wren-Lewis and his buddies come up with, they do not help us work on the fundamental issues of modern economies. Using that kind of model only confirms Robert Gordon‘s dictum that today

rigor competes with relevance in macroeconomic and monetary theory, and in some lines of development macro and monetary theorists, like many of their colleagues in micro theory, seem to consider relevance to be more or less irrelevant.

Georgescu-Roegen on why ‘most of the time all of us talk some nonsense’

2 February, 2016 at 15:24 | Posted in Economics | Leave a comment

Positivism does not seem to realize at all that the concept of verifiability — or the position that ‘the meaning of a proposition is the method of its verification’ — is covered by a dialectical penumbra in spite of the apparent rigor of the sentences used in the argument …

I hope the reader will not take offense at the unavoidable conclusion that most of the time all of us talk some nonsense, that is, express our thoughts in dialectical terms with no clear-cut meaning …

The position that dialectical concepts should be barred from science because they would infest it with muddled thinking, is, therefore, a flight of fancy — unfortunately, not an innocuous one. For it has bred another kind of muddle that now plagues large sectors of social sciences: arithmomania. To cite a few cases from economics alone. The complex notion of economic development has been reduced to a number, the income per capita. The dialectical spectrum of human wants … has long since been covered under the colorless numerical concept of ‘utility’ for which, moreover, nobody has yet been able to provide an actual procedure of measurement.

In the postwar period, it has become increasingly clear that economic growth has not only brought greater prosperity. The other side of growth, in the form of pollution, contamination and wastage of resources, has emerged as perhaps the greatest challenge of our time.

Against the neoclassical theory’s view of the economy as a balanced and harmonious system, where growth and the environment go hand in hand, ecological economists object that it can rather be characterized as an unstable system that at an accelerating pace consumes energy and matter, and thereby poses a threat to the very basis for its survival.

The Romanian-American economist Nicholas Georgescu-Roegen (1906-1994) argued in his epochal The Entropy Law and the Economic Process (1971) that the economy was actually a giant thermodynamic system in which entropy increases inexorably and our material basis disappears. If we choose to continue to produce with the techniques we have developed, then our society and earth will disappear faster than if we introduce small-scale production, resource-saving technologies and limited consumption.

Following Georgescu-Roegen, ecological economists have argued that industrial society inevitably leads to increased environmental pollution, energy crises and unsustainable growth.

After a radio debate with one of the members of the prize committee twenty years ago, yours truly asked why Georgescu-Roegen hadn’t got the prize. The answer was – mirabile dictu – that he “never founded a school.” Talk about nonsense! I was surprised, to say the least, and wondered if he possibly had heard of the environmental movement. Well, he had – but it was “the wrong kind of school.” Can it be stated much more clearly than this what it is all about? If you haven’t worked within the neoclassical paradigm, you are excluded a priori from being eligible for the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel!

Deductivism — the original sin in economics

2 February, 2016 at 10:27 | Posted in Economics | 2 Comments


Mathematics, especially through the work of David Hilbert, became increasingly viewed as a discipline properly concerned with providing a pool of frameworks for possible realities …

This emergence of the axiomatic method removed at a stroke various hitherto insurmountable constraints facing those who would mathematise the discipline of economics. Researchers involved with mathematical projects in economics could, for the time being at least, postpone the day of interpreting their preferred axioms and assumptions. There was no longer any need to seek the blessing of mathematicians and physicists or of other economists who might insist that the relevance of metaphors and analogies be established at the outset. In particular it was no longer regarded as necessary, or even relevant, to economic model construction to consider the nature of social reality, at least for the time being …

The result was that in due course deductivism in economics, through morphing into mathematical deductivism on the back of developments within the discipline of mathematics, came to acquire a new lease of life, with practitioners (once more) potentially oblivious to any inconsistency between the ontological presuppositions of adopting a mathematical modelling emphasis and the nature of social reality. The consequent rise of mathematical deductivism has culminated in the situation we find today.

Tony Lawson

‘Rigorous evidence’? Yes — and totally useless!

1 February, 2016 at 16:31 | Posted in Economics | 8 Comments

So far we have shown that for two prominent questions in the economics of education, experimental and non-experimental estimates appear to be in tension. Furthermore, experimental results across different contexts are often in tension with each other. The first tension presents policymakers with a trade-off between the internal validity of estimates from the “wrong” context, and the greater external validity of observational data analysis from the “right” context. The second tension, between equally well-identified results across contexts, suggests that the resolution of this trade-off is not trivial. There appears to be genuine heterogeneity in the true causal parameter across contexts.

These findings imply that the common practice of ranking evidence by its level of “rigor”, without respect to context, may produce misleading policy recommendations …

Despite the fact that we have chosen to focus on extremely well-researched literatures, it is plausible that a development practitioner confronting questions related to class size, private schooling, or the labor-market returns to education would confront a dearth of well-identified, experimental or quasi-experimental evidence from the country or context in which they are working. They would instead be forced to choose between less internally valid OLS estimates, and more internally valid experimental estimates produced in a very different setting. For all five of the examples explored here, the literature provides a compelling case that policymakers interested in minimizing the error of their parameter estimates would do well to prioritize careful thinking about local evidence over rigorously-estimated causal effects from the wrong context.

Lant Pritchett & Justin Sandefur

Randomization — just as econometrics — is basically a deductive method. Given the assumptions (such as manipulability, transitivity, separability, additivity, linearity, etc.) these methods deliver ‘deductive’ inferences. The problem, of course, is that we will never completely know when the assumptions are right. And although randomization may contribute to controlling for confounding, it does not guarantee it, since genuine randomness presupposes infinite experimentation and we know all real experimentation is finite. And even if randomization may help to establish average causal effects, it says nothing of individual effects unless homogeneity is added to the list of assumptions. Real target systems are seldom epistemically isomorphic to our axiomatic-deductive models/systems, and even if they were, we still have to argue for the external validity of the conclusions reached from within these epistemically convenient models/systems. Causal evidence generated by randomization procedures may be valid in “closed” models, but what we usually are interested in is causal evidence in the real world we happen to live in.
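A deliberately simple simulation — the numbers are entirely hypothetical, chosen only for illustration — shows why the homogeneity assumption matters. A cleanly randomized experiment can recover the average effect perfectly and still say nothing about who gains and who loses:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical heterogeneous treatment effects:
# 60% of the population gains 2, 40% loses 1.
effect = np.where(rng.random(n) < 0.6, 2.0, -1.0)

# Randomized assignment and observed outcomes
treated = rng.random(n) < 0.5
baseline = rng.normal(10.0, 2.0, n)
outcome = baseline + treated * effect

# Difference in means recovers the average treatment effect (~0.8)
ate = outcome[treated].mean() - outcome[~treated].mean()
print(f"Estimated average treatment effect: {ate:.2f}")
print(f"Share of individuals harmed by treatment: {(effect < 0).mean():.0%}")
```

The estimated average effect of roughly 0.8 is impeccably ‘rigorous’ in the internal-validity sense, yet it is perfectly consistent with the treatment harming four out of ten individuals — which is exactly the kind of information a policymaker would need.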

Tainted love

1 February, 2016 at 16:17 | Posted in Economics, Varia | Leave a comment


Tänkte inte på det … (‘Didn’t think of that …’)

1 February, 2016 at 12:59 | Posted in Economics | Leave a comment

[h/t Jeanette Meyer]

Revealed preference and the fundamental flaws of conventional economics

31 January, 2016 at 15:35 | Posted in Economics | 5 Comments

We must learn WHY the argument for revealed preference, which deceived Samuelson, is wrong. As per standard positivist ideas, preferences are internal to the heart and unobservable; hence they cannot be used in scientific theories. So Samuelson came up with the idea of using observable choices – unobservable preferences are revealed by observable choices … Yet the basic argument is wrong; one cannot eliminate the unobservable preference from economic theories. Understanding this error, which Samuelson failed to do, is the first knot to unravel, in order to clear our minds and hearts of the logical positivist illusions.

Asad Zaman

Asad Zaman’s blog post reminded me of an article on revealed preference theory that yours truly wrote almost twenty-five years ago, published in History of Political Economy (no. 25, 1993).

Paul Samuelson wrote a kind letter and informed me that he was the one who had recommended it for publication. But although he liked a lot in it, he also wrote a comment — published in the same volume of HOPE — saying:

Between 1938 and 1947, and since then as Pålsson Syll points out, I have been scrupulously careful not to claim for revealed preference theory novelties and advantages it does not merit. But Pålsson Syll’s readers must not believe that it was all redundant fuss about not very much.

Notwithstanding Samuelson’s comment, I do still think it basically was much fuss about ‘not very much.’

In 1938 Paul Samuelson offered a replacement for the then accepted theory of utility. The cardinal utility theory was discarded with the following words: “The discrediting of utility as a psychological concept robbed it of its possible virtue as an explanation of human behaviour in other than a circular sense, revealing its emptiness as even a construction” (1938, 61). According to Samuelson, the ordinalist revision of utility theory was, however, not drastic enough. The introduction of the concept of a marginal rate of substitution was considered “an artificial convention in the explanation of price behaviour” (1938, 62). One ought to analyze the consumer’s behaviour without having recourse to the concept of utility at all, since this did not correspond to directly observable phenomena. The old theory was criticized mainly from a methodological point of view, in that it used non-observable concepts and propositions.

The new theory should avoid this and thereby shed “the last vestiges of utility analysis” (1938, 62). Its main feature was a consistency postulate which said “if an individual selects batch one over batch two, he does not at the same time select two over one” (1938, 65). From this “perfectly clear” postulate and the assumptions of given demand functions and that all income is spent, Samuelson could, in (1938) and (1938a), derive all the main results of ordinal utility theory (single-valuedness and homogeneity of degree zero of demand functions, and negative semi-definiteness of the substitution matrix).
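Samuelson’s consistency postulate is nowadays usually called the weak axiom of revealed preference (WARP). As a concrete illustration — my own sketch, not anything taken from the 1938 article — here is how one can check a set of observed price-quantity pairs for violations of it:

```python
import numpy as np

def warp_violations(prices, bundles):
    """Return pairs (i, j) of observations violating the weak axiom.

    Bundle i is 'revealed preferred' to bundle j when j was affordable
    at observation i's prices (p_i . x_j <= p_i . x_i) but i was chosen.
    WARP is violated if j is simultaneously revealed preferred to i.
    """
    p = np.asarray(prices, dtype=float)
    x = np.asarray(bundles, dtype=float)
    spend = np.einsum('ik,jk->ij', p, x)  # spend[i, j] = p_i . x_j
    own = np.diag(spend)                  # cost of each chosen bundle
    violations = []
    for i in range(len(p)):
        for j in range(len(p)):
            if i != j and not np.array_equal(x[i], x[j]) \
                    and spend[i, j] <= own[i] and spend[j, i] <= own[j]:
                violations.append((i, j))
    return violations

# Two observations that contradict each other: each chosen bundle was
# affordable when the other one was chosen.
prices  = [[1.0, 2.0], [2.0, 1.0]]
bundles = [[2.0, 2.0], [3.0, 1.0]]
print(warp_violations(prices, bundles))  # -> [(0, 1), (1, 0)]
```

It is exactly this kind of inconsistency that Sippel’s experimental subjects, discussed below, produced in considerable numbers.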

In 1948 Samuelson no longer considered his “revealed preference” approach a new theory. It was then seen as a means of revealing consistent preferences and enhancing the acceptability of the ordinary ordinal utility theory by showing how one could construct an individual’s indifference map purely by observing his market behaviour. Samuelson concluded his article by saying that “[t]he whole theory of consumer’s behavior can thus be based upon operationally meaningful foundations in terms of revealed preference” (1948, 251). As has been shown lately, this is true only if we inter alia assume the consumer to be rational and to have unchanging preferences that are complete, asymmetrical, non-satiated, strictly convex, and transitive (or continuous). The theory, originally intended as a substitute for the utility theory, has, as Houthakker clearly notes, “tended to become complementary to the latter” (1950, 159).

Only a couple of years later, Samuelson held the view that he was in a position “to complete the programme begun a dozen years ago of arriving at the full empirical implications for demand behaviour of the most general ordinal utility analysis” (1950, 369). The introduction of Houthakker’s amendment assured integrability, and by that the theory had according to Samuelson been “brought to a close” (1950, 355). Starting “from a few logical axioms of demand consistency … [one] could derive the whole of the valid utility analysis as corollaries” (1950, 370). Since Samuelson had shown the “complete logical equivalence” of revealed preference theory with the regular “ordinal preference approach,” it follows that “in principle there is nothing to choose between the formulations” (1953, 1). According to Houthakker (1961, 709), the aim of the revealed preference approach is “to formulate equivalent systems of axioms on preferences and on demand functions.”

But if this is all, what has revealed preference theory then achieved? As it turns out, ordinal utility theory and revealed preference theory are – as Wong puts it – “not two different theories; at best, they are two different ways of expressing the same set of ideas” (2006, 118). And with regard to the theoretically solvable problem, we may still concur with Hicks that “there is in practice no direct test of the preference hypothesis” (1956, 58).

Sippel’s experiments showed “a considerable number of violations of the revealed preference axioms” (1997, 1442) and that from a descriptive point of view – as a theory of consumer behaviour – the revealed preference theory was of very limited value.

Today it seems as though the proponents of revealed preference theory have given up the original 1938 attempt at building a theory on nothing but observable facts, and settled instead for the 1950 version of establishing “logical equivalences.”

Mas-Colell et al. conclude their presentation of the theory by noting that “for the special case in which choice is defined for all subsets of X [the set of alternatives], a theory based on choice satisfying the weak axiom is completely equivalent to a theory of decision making based on rational preferences” (1995, 14).

When talking of determining people’s preferences through observation, Varian, for example, has “to assume that the preferences will remain unchanged” and adopts “the convention that … the underlying preferences … are known to be strictly convex.” He further postulates that the “consumer is an optimizing consumer.” If we are “willing to add more assumptions about consumer preferences, we get more precise estimates about the shape of indifference curves” (2006, 119-123, author’s italics). Given these assumptions, and that the observed choices satisfy the consistency postulate as amended by Houthakker, one can always construct preferences that “could have generated the observed choices.” This does not, however, prove that the constructed preferences really generated the observed choices, “we can only show that observed behavior is not inconsistent with the statement. We can’t prove that the economic model is correct.”

Kreps holds a similar view, pointing to the fact that revealed preference theory is “consistent with the standard preference-based theory of consumer behavior” (1990, 30).

The theory of consumer behavior has been developed in great part as an attempt to justify the idea of a downward-sloping demand curve. Forerunners like Cournot (1838) and Cassel (1899) merely asserted this law of demand. The utility theorists tried to deduce it from axioms and postulates on individuals’ economic behaviour. Revealed preference theory tried to build a new theory and to put it in operational terms, but ended up with just giving a theory logically equivalent to the old one. As such it also shares the old theory’s shortcomings of being empirically non-falsifiable and of being based on unrestricted universal statements.

As Kornai (1971, 133) remarked, “the theory is empty, tautological. The theory reduces to the statement that in period t the decision-maker chooses what he prefers … The task is to explain why he chose precisely this alternative rather than another one.” Further, pondering Amartya Sen’s verdict of the revealed preference theory as essentially underestimating “the fact that man is a social animal and his choices are not rigidly bound to his own preferences only” (1982, 66) and Georgescu-Roegen’s (1966, 192-3) apt description, a harsh assessment of what the theory accomplished should come as no surprise:

Lack of precise definition should not … disturb us in moral sciences, but improper concepts constructed by attributing to man faculties which he actually does not possess, should. And utility is such an improper concept … [P]erhaps, because of this impasse … some economists consider the approach offered by the theory of choice as a great progress … This is simply an illusion, because even though the postulates of the theory of choice do not use the terms ‘utility’ or ‘satisfaction’, their discussion and acceptance require that they should be translated into the other vocabulary … A good illustration of the above point is offered by the ingenious theory of the consumer constructed by Samuelson.

Nothing lost, nothing gained.

References
Cassel, Gustav 1899. “Grundriss einer elementaren Preislehre.” Zeitschrift für die gesamte Staatswissenschaft 55.3:395-458.

Cournot, Augustin 1838. Recherches sur les principes mathématiques de la théorie des richesses. Paris. Translated by N. T. Bacon 1897 as Researches into the Mathematical Principles of the Theory of Wealth. New York: The Macmillan Company.

Georgescu-Roegen, Nicholas 1966. “Choice, Expectations, and Measurability.” In Analytical Economics: Issues and Problems. Cambridge, Massachusetts: Harvard University Press.

Hicks, John 1956. A Revision of Demand Theory. Oxford: Clarendon Press.

Houthakker, Hendrik 1950. “Revealed Preference and the Utility Function.” Economica 17 (May):159-74.
–1961. “The Present State of Consumption Theory.” Econometrica 29 (October):704-40.

Kornai, Janos 1971. Anti-equilibrium. London: North-Holland.

Kreps, David 1990. A Course in Microeconomic Theory. New York: Harvester Wheatsheaf.

Mas-Colell, Andreu et al. 1995. Microeconomic Theory. New York: Oxford University Press.

Samuelson, Paul 1938. “A Note on the Pure Theory of Consumer’s Behaviour.” Economica 5 (February):61-71.
–1938a. “A Note on the Pure Theory of Consumer’s Behaviour: An Addendum.” Economica 5 (August):353-4.
–1947. Foundations of Economic Analysis. Cambridge, Massachusetts: Harvard University Press.
–1948. “Consumption Theory in Terms of Revealed Preference.” Economica 15 (November):243-53.
–1950. “The Problem of Integrability in Utility Theory.” Economica 17 (November):355-85.
–1953. “Consumption Theorems in Terms of Overcompensation rather than Indifference Comparisons.” Economica 20 (February):1-9.

Sen, Amartya 1982. Choice, Welfare and Measurement. London: Basil Blackwell.

Sippel, Reinhard 1997. “An experiment on the pure theory of consumer’s behaviour.” Economic Journal 107:1431-44.

Varian, Hal 2006. Intermediate Microeconomics: A Modern Approach. (7th ed.) New York: W. W. Norton & Company.

Wong, Stanley 2006. The Foundations of Paul Samuelson’s Revealed Preference Theory. (Revised ed.) London: Routledge & Kegan Paul.

Keynes vs. Samuelson on models

31 January, 2016 at 14:44 | Posted in Economics | 1 Comment

To his credit Keynes was not, in contrast to Samuelson, a formalist who was committed to mathematical economics. Keynes wanted models, but for him, building them required ‘a vigilant observation of the actual working of our system.’ Indeed, ‘to convert a model into a quantitative formula is to destroy its usefulness as an instrument of thought.’ That conclusion can be strongly endorsed!
