Piketty and the neoclassical heart of darkness

13 June 2014, 20:32 | Posted in Economics | 2 comments

If you have an apple and I have an apple and we exchange these apples then you and I will each have one apple.

But if you have an idea and I have an idea and we exchange these ideas, then each of us will have two ideas.

George Bernard Shaw

I came to think of this dictum today, when reading yet another Piketty critique — this time by Matthew Rognlie (on which Brad DeLong has some interesting thoughts). As I see it, the gist of the critique is that neoclassical economists — force-fed on growth models that take constant returns to scale and diminishing marginal returns to each factor for granted — can’t really accept, or even comprehend, Piketty’s argument that the rate of return on capital will not diminish.

In Paul Romer’s Endogenous Technological Change (1990), knowledge is made the most important driving force of growth. Knowledge (ideas) is presented as the locomotive of growth — but as Allyn Young, Piero Sraffa and others had already shown in the 1920s, knowledge is also bound up with increasing returns to scale, and is therefore not really compatible with neoclassical economics and its emphasis on constant returns to scale.

The increasing returns generated by the non-rivalry of ideas are simply not compatible with pure competition and the simplistic invisible-hand dogma. That is probably also the reason why neoclassical economists have been so reluctant to embrace the theory wholeheartedly.


Neoclassical economics has tried to save itself by more or less substituting human capital for knowledge/ideas. But knowledge or ideas should not be confused with human capital. Although some have problems with the distinction between ideas and human capital in modern endogenous growth theory, this passage gives a succinct and accessible account of the difference:

Of the three state variables that we endogenize, ideas have been the hardest to bring into the applied general equilibrium structure. The difficulty arises because of the defining characteristic of an idea, that it is a pure nonrival good. A given idea is not scarce in the same way that land or capital or other objects are scarce; instead, an idea can be used by any number of people simultaneously without congestion or depletion.

Because they are nonrival goods, ideas force two distinct changes in our thinking about growth, changes that are sometimes conflated but are logically distinct. Ideas introduce scale effects. They also change the feasible and optimal economic institutions. The institutional implications have attracted more attention but the scale effects are more important for understanding the big sweep of human history.

The distinction between rival and nonrival goods is easy to blur at the aggregate level but inescapable in any microeconomic setting. Picture, for example, a house that is under construction. The land on which it sits, capital in the form of a measuring tape, and the human capital of the carpenter are all rival goods. They can be used to build this house but not simultaneously any other. Contrast this with the Pythagorean Theorem, which the carpenter uses implicitly by constructing a triangle with sides in the proportions of 3, 4 and 5. This idea is nonrival. Every carpenter in the world can use it at the same time to create a right angle.

Of course, human capital and ideas are tightly linked in production and use. Just as capital produces output and forgone output can be used to produce capital, human capital produces ideas and ideas are used in the educational process to produce human capital. Yet ideas and human capital are fundamentally distinct. At the micro level, human capital in our triangle example literally consists of new connections between neurons in a carpenter’s head, a rival good. The 3-4-5 triangle is the nonrival idea. At the macro level, one cannot state the assertion that skill-biased technical change is increasing the demand for education without distinguishing between ideas and human capital.

Paul Krugman also has some interesting thoughts on the history of that dangerous idea — increasing returns: 

I have worked and written on a lot of topics. It is, however, the idea of increasing returns that has been the most important theme in my work. And it is my work in helping to clarify the role that increasing returns plays in economics that is the main excuse I have for my existence. The idea of increasing returns is, of course, a very old one, going back at least to Adam Smith. Nonetheless, until the 1980s economics was heavily dominated by what we may call the Ricardian Simplification: the assumption of constant returns and perfect competition …

The world isn’t really characterized by constant returns, and it was essential to go beyond the Ricardian Simplification, if only to be able to say to the policymakers that we had explored that terrain and found little of use.

If one admits increasing returns into one’s economic model, two other consequences follow. First, increasing returns are intimately bound up with the possibility of multiple equilibria. There can be multiple equilibria in constant-returns models, too, but they are rarely either plausible or interesting. By contrast, it is very easy to be persuaded of both the relevance and importance of multiple equilibria due to increasing returns … Second, once there are interesting multiple equilibria, you need a story about how the economy picks one. The natural stories involve dynamics — the cumulation of initial advantages that may be accidents of history …

All of this is fairly obvious, and indeed the history of thought in economics is littered with manifestos on the need to take into account increasing returns, multiple equilibria, dynamics, and the role of history … Nonetheless, it wasn’t until the 1980s that increasing returns really got into the mainstream of economics. I wasn’t the only one in the movement: Paul Romer, in particular, wrote several papers I wish I had written … applying increasing returns to economic growth …

Paul Krugman, Incidents from my career

In one way one might say that increasing returns is the darkness of the neoclassical heart. And this is something most mainstream neoclassical economists don’t really want to talk about. They prefer to look the other way and pretend that increasing returns can be seamlessly incorporated into the received paradigm.

A couple of years ago yours truly wrote a review of David Warsh’s great book on growth theory – Knowledge and the Wealth of Nations – for an economics journal. The editor accepted it for publication – but only if I was willing to cut the parts where I highlighted Warsh’s discussion of increasing returns to scale and the effort neoclassical economists had, over the decades, put into willfully ”forgetting” this disturbing anomaly. Moral: some dogmas are not to be questioned – at least not if you want to get published!

How to become a big-shot economist

13 June 2014, 18:33 | Posted in Economics | Comments Off on How to become a big-shot economist

 

For my daughter Tora, who has just completed her first year at the Stockholm School of Economics.

Rites

12 June 2014, 19:25 | Posted in Varia | Comments Off on Rites

 

Ack Värmeland, du sköna (Jussi Björling)

12 June 2014, 15:25 | Posted in Varia | Comments Off on Ack Värmeland, du sköna (Jussi Björling)

 

Neoclassical synthesis — is that really in Keynes?

12 June 2014, 11:44 | Posted in Economics | 1 comment

A new second edition of An Encyclopedia of Keynesian Economics is out, featuring accessible, informative and provocative contributions by leading Keynesian scholars working in the tradition of Keynes. For those interested in the debate on Keynes and the Keynesian Revolution it is a must.

[Oh, and yes, yours truly is one of the contributors – as are e.g. David Colander, Sheila Dow, Geoff Harcourt, Donald Moggridge, Paul Samuelson, Robert Solow and Warren Samuels.]

Many of the entries are still highly interesting reads. One of my favourites is Edward McKenna’s and Diane Zannoni’s entry on the Neoclassical Synthesis, which concludes with the following words:

Finally it should be noted that the neoclassical economists’ adoption of classical theory of supply tied Keynesian thought to a theory wholly at variance with that advanced by Keynes himself, in Chapter 3 of General Theory. Had the neoclassicals not done this, it is unlikely that they would have neglected the issue of price formation. And, given the nature of Keynes’s theory, it is certain that the issue of income distribution would have assumed far greater prominence than it has in the neoclassical synthesis.

Comparing McKenna and Zannoni’s view with Krugman’s

the doctrine, made famous by Paul Samuelson but actually there in Keynes too, that macroeconomic policy is needed for full employment but once you have that a relatively free-market policy works

or DeLong’s

the fons et origo of what Stiglitz regards as a major intellectual error … was none other than John Maynard Keynes himself

yours truly, on this issue, definitely sides with McKenna/Zannoni and Stiglitz.

What ought to be on every macroeconomist’s reading list

11 June 2014, 13:38 | Posted in Economics | 2 comments

What is the problem we wish to solve when we try to construct a rational economic order? … If we possess all the relevant information, if we can start out from a given system of preferences, and if we command complete knowledge of available means, the problem which remains is purely one of logic …

This, however, is emphatically not the economic problem which society faces … The peculiar character of the problem of a rational economic order is determined precisely by the fact that the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess. The economic problem of society is … a problem of the utilization of knowledge which is not given to anyone in its totality.

This character of the fundamental problem has, I am afraid, been obscured rather than illuminated by many of the recent refinements of economic theory … Many of the current disputes with regard to both economic theory and economic policy have their common origin in a misconception about the nature of the economic problem of society. This misconception in turn is due to an erroneous transfer to social phenomena of the habits of thought we have developed in dealing with the phenomena of nature …

To assume all the knowledge to be given to a single mind in the same manner in which we assume it to be given to us as the explaining economists is to assume the problem away and to disregard everything that is important and significant in the real world.

Compare this relevant and realist wisdom with the rational expectations hypothesis (REH) used by almost all mainstream macroeconomists today. REH presupposes – basically for reasons of consistency – that agents have complete knowledge of all the relevant probability distribution functions. And when attempts are made to incorporate learning in these models – to take the heat off some of the criticism levelled against the hypothesis to date – it is always a very restricted kind of learning that is considered: a learning in which truly unanticipated, surprising, new things never take place, only rather mechanical updatings of existing probability functions that increase the precision of already existing information sets.

Nothing really new happens in these ergodic models, where the statistical representation of learning and information is nothing more than a caricature of what takes place in the real-world target system. This follows from taking for granted that people’s decisions can be portrayed as based on an existing probability distribution, which by definition implies knowledge of every possible event that could conceivably take place (otherwise it is, in a strict mathematical-statistical sense, not really a probability distribution).

But in the real world it is – as behavioural and experimental economics have shown again and again – common to mistake a conditional distribution for a probability distribution. Such mistakes are impossible to make in the kind of economic analysis – built on the rational expectations hypothesis – of which Levine is such an adamant propagator. On average, rational expectations agents are always correct. But truly new information will not merely reduce the estimation error; it may actually change the entire estimation and hence, possibly, the decisions made. To be truly new, information has to be unexpected. If it were not, it could simply be inferred from the already existing information set.

In rational expectations models, new information is typically presented as something that only reduces the variance of the parameter estimated. But if new information means truly new information, it could actually increase our uncertainty and variance (the information set expanding from (A, B) to (A, B, C)).

Truly new information gives birth to new probabilities, revised plans and decisions – something the rational expectations hypothesis cannot account for with its finite sampling representation of incomplete information.
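To see how truly new information can increase rather than reduce uncertainty, here is a minimal numerical sketch (my own illustration, with made-up payoffs and probabilities, not anything from the rational expectations literature): once the support of the distribution itself has to be revised, with the information set expanding from (A, B) to (A, B, C), the variance of the relevant variable can easily go up.

```python
# My own toy illustration (made-up numbers): an agent initially believes only
# outcomes A and B are possible for some payoff-relevant variable. "Truly new"
# information then reveals a third outcome, C, forcing a revision of the
# distribution's support itself, and the variance rises instead of falling.
import numpy as np

def variance(payoffs, probs):
    payoffs, probs = np.asarray(payoffs), np.asarray(probs)
    return float(np.sum(probs * payoffs**2) - np.sum(probs * payoffs)**2)

var_ab = variance([1.0, 2.0], [0.5, 0.5])             # information set (A, B)
var_abc = variance([1.0, 2.0, 6.0], [0.4, 0.4, 0.2])  # revised set (A, B, C)

print(var_ab, var_abc)  # roughly 0.25 -> 3.44: "learning" here increases uncertainty
```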

In the world of rational expectations, learning is like getting better and better at reciting the complete works of Shakespeare by heart – or at hitting the bull’s eye when playing darts. It presupposes that we have a complete list of the possible states of the world and that, by definition, mistakes are non-systematic (which, strictly speaking, follows from the assumption that the “subjective” probability distributions equal the “objective” probability distribution). This is a rather uninteresting and trivial kind of learning. It is closed-world learning, synonymous with improving one’s adaptation to a world that is fundamentally unchanging. But in real, open-world situations, learning is more often about adapting to and trying to cope with genuinely new phenomena.

The rational expectations hypothesis presumes consistent behaviour, where expectations do not display any persistent errors. In the world of rational expectations we are always, on average, hitting the bull’s eye. In the more realistic, open-systems view, there is always the possibility (danger) of making mistakes that may turn out to be systematic. It is presumably because of this that we put so much emphasis on learning in our modern knowledge societies.

As Hayek wrote:

When it comes to the point where [equilibrium analysis] misleads some of our leading thinkers into believing that the situation which it describes has direct relevance to the solution of practical problems, it is high time that we remember that it does not deal with the social process at all and that it is no more than a useful preliminary to the study of the main problem.

Added: Just in case you’re contemplating commenting on this post and think that Hayek is all bad — read this!

Rethinking Economics Conference in London June 28-29

10 June 2014, 19:57 | Posted in Economics | 1 comment

 
You can find more on the conference here

Brothers in Arms

10 June 2014, 18:27 | Posted in Varia | Comments Off on Brothers in Arms

 

Piketty and reasonable estimates of depreciation rates (wonkish)

10 June 2014, 10:22 | Posted in Economics | Comments Off on Piketty and reasonable estimates of depreciation rates (wonkish)

Thomas Piketty emails:

”We do provide long run series on capital depreciation in the “Capital is back” paper with Gabriel [Zucman] (see http://piketty.pse.ens.fr/capitalisback, appendix country tables US.8, JP.8, etc.). The series are imperfect and incomplete, but they show that in pretty much every country capital depreciation has risen from 5-8% of GDP in the 19th century and early 20th century to 10-13% of GDP in the late 20th and early 21st centuries, i.e. from about 1%[/year] of capital stock to about 2%[/year].

Of course there are huge variations across industries and across assets, and depreciation rates could be a lot higher in some sectors. Same thing for capital intensity.

The problem with taking away the housing sector (a particularly capital-intensive sector) from the aggregate capital stock is that once you start to do that it’s not clear where to stop (e.g., energy is another capital intensive sector). So we prefer to start from an aggregate macro perspective (including housing). Here it is clear that 10% or 5% depreciation rates do not make sense.”

No, James Hamilton, it is not the case that the fact that “rates of 10-20%[/year] are quite common for most forms of producers’ machinery and equipment” means that 10%/year is a reasonable depreciation rate for the economy as a whole–and especially not for Piketty’s concept of wealth, which is much broader than simply produced means of production.

No, Per Krusell and Anthony Smith, the fact that “we conducted a quick survey among macroeconomists at the London School of Economics, where Tony and I happen to be right now, and the average answer was 7%[/year]” for “the” depreciation rate does not mean that you have any business using a 10%/year economy-wide depreciation rate in trying to assess how the net savings share would respond to increases in Piketty’s wealth-to-annual-net-income ratio.

Who are these London School of Economics economists who think that 7%/year is a reasonable depreciation rate for a wealth concept that attains a pre-World War I level of 7 times a year’s net national income? I cannot imagine any of the LSE economists signing on to the claim that back before WWI capital consumption in northwest European economies was equal to 50% of net income–that depreciation was a third of gross economic product…

Brad DeLong

Some of the critics of Piketty obviously think that the issue is theoretical and that somehow he has misspecified the standard growth model. Now, Piketty doesn’t really talk that much about the standard (Solow) growth model in the book, but let’s do a back-of-the-envelope analysis based on that model.

Take a diehard neoclassical setup (a production function homogeneous of degree one with unlimited substitutability), such as the standard Cobb-Douglas production function y = Ak^α, with A a given productivity parameter and k the ratio of capital stock to labour (K/L). With a constant investment share λ out of output y and a constant depreciation rate δ on capital per worker k, the rate of accumulation of k is Δk = λy – δk = λAk^α – δk. In steady state (*) we have λAk*^α = δk*, giving λ/δ = k*/y* and k* = (λA/δ)^(1/(1–α)). Putting this value of k* into the production function gives the steady-state output per worker y* = Ak*^α = A^(1/(1–α))(λ/δ)^(α/(1–α)).

Assuming exogenous Harrod-neutral technological progress that increases y at a growth rate g (with zero labour-force growth, and with y and k from now on redefined as y/A and k/A respectively, so that the production function becomes y = k^α), we get dk/dt = λy – (g + δ)k, which in the Cobb-Douglas case is dk/dt = λk^α – (g + δ)k, with steady-state value k* = (λ/(g + δ))^(1/(1–α)) and capital-output ratio k*/y* = k*/k*^α = λ/(g + δ). If we instead follow Piketty’s preferred accounting, with λ denoting the saving rate net of depreciation, the final expression becomes k*/y* = λ/(g + λδ).

Now, what Piketty predicts is that g will fall and that this will increase the capital-output ratio. Say we have δ = 0.03, λ = 0.1 and g = 0.03 initially. This gives a capital-output ratio of around 3. If g falls to 0.01, it rises to around 7.7. We reach analogous results if we use a basic CES production function with an elasticity of substitution σ > 1. With σ = 1.5, the capital share rises from 0.28 to 0.36 if the wealth-income ratio goes from 2.5 to 5, which according to Piketty is what has actually happened in rich countries during the last forty years.
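For those who want to check the arithmetic, here is a short back-of-the-envelope sketch in Python (my own, simply plugging the parameter values above into the steady-state formulas just derived):

```python
# Steady-state capital-output ratios from the Cobb-Douglas/Solow formulas above
# (my own sketch; parameter values as in the text).

def k_over_y_gross(lam, g, delta):
    """Gross accounting: k*/y* = lam / (g + delta), with lam the gross saving rate."""
    return lam / (g + delta)

def k_over_y_net(lam, g, delta):
    """Piketty's preferred accounting, lam a saving rate net of depreciation:
    k*/y* = lam / (g + lam * delta)."""
    return lam / (g + lam * delta)

lam, delta = 0.1, 0.03
print(round(k_over_y_gross(lam, 0.03, delta), 2))  # ~1.7 under gross accounting
print(round(k_over_y_net(lam, 0.03, delta), 2))    # ~3.0 with g = 3 %
print(round(k_over_y_net(lam, 0.01, delta), 2))    # ~7.7 when g falls to 1 %
```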

Being able to show that you can get the Piketty results using one or another of the available standard neoclassical growth models is of course — from a realist point of view — of limited value. As usual, the really interesting thing is how well the assumptions you make, and the numerical values you put into the model specification, accord with reality.

Divenire

9 June 2014, 19:39 | Posted in Varia | Comments Off on Divenire

 

I’m probably not the only blogger/researcher who likes to listen to music while writing/working. Zbigniew Preisner, Philip Glass, Miles Davis, Arvo Pärt and Jan Garbarek are longtime favorites. Ludovico Einaudi is a rather new acquaintance.

Undeniable macroeconomic truths

9 June 2014, 13:41 | Posted in Economics | 2 comments

”Nominal wages are sticky”: Well, every piece of research I’ve seen on this subject … agrees that nominal wages are sticky, at least in the downward direction. But the kind of exogenous stickiness in most ”New Keynesian” models doesn’t make a lot of sense. So this ”undeniable truth” gets only a provisional pass, since the real ”stickiness” might not affect the economy in the way ”Keynesians” think.
Verdict: True.


”A lot of unemployment is involuntary”: The more you think about models of labor and unemployment, the more you realize that ”voluntary” is not a well-defined term. But since many unemployed people definitely seem to think (correctly or incorrectly!) that they can’t find any sort of job, I’ll give this one a provisional pass as well, with the caveat that ”involuntary” is defined in the mind of the unemployed person.
Verdict: True.

Noah Smith

Indeed — a lot of unemployment certainly is ”involuntary.” So, people calling themselves ‘New Keynesians’ ought to be rather embarrassed by the fact that the kind of microfounded dynamic stochastic general equilibrium models they use cannot incorporate such a basic fact of reality as involuntary unemployment!

Of course, when working with microfounded representative-agent models, this should come as no surprise. If one representative agent is employed, all representative agents are. The kind of unemployment that occurs is voluntary, since it consists only of the adjustments to hours of work that these optimizing agents make in order to maximize their utility. Maybe that’s also the reason the prominent ‘New Keynesian’ macroeconomist Simon Wren-Lewis can write:

I think the labour market is not central, which was what I was trying to say in my post. It matters in a [New Keynesian] model only in so far as it adds to any change to inflation, which matters only in so far as it influences central bank’s decisions on interest rates.

In the basic DSGE models used by most ‘New Keynesians’, the labour market always clears – responding to a changing interest rate, expected lifetime income, or real wages, the representative agent maximizes her utility function by varying her labour supply, money holdings and consumption over time. Most importantly – if the real wage somehow deviates from its “equilibrium value,” the representative agent adjusts her labour supply, so that when the real wage is higher than its “equilibrium value,” labour supply is increased, and when the real wage is below its “equilibrium value,” labour supply is decreased.

In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence, unemployment is totally voluntary. To be unemployed is something one optimally chooses to be.
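To make concrete what “voluntary” means here, consider a minimal sketch (my own toy calibration, not any particular published model) of the intratemporal labour-supply choice in such a representative-agent world: hours worked simply track the real wage, and any variation in employment is, by construction, an optimal choice.

```python
# Toy representative-agent labour supply (my own illustrative calibration).
# Utility u(c, n) = log(c) - chi * n**(1+phi) / (1+phi), consumption c = w*n + d
# (wage income plus some non-labour income d). The first-order condition
# w / (w*n + d) = chi * n**phi pins down hours n. There is no rationing, so
# any movement in hours is a "voluntary" response to the real wage w.

def optimal_hours(w, d=0.2, chi=1.0, phi=1.0):
    """Solve the first-order condition for hours n by bisection."""
    lo, hi = 1e-6, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        # positive means the marginal benefit of working still exceeds the marginal disutility
        if w / (w * mid + d) - chi * mid**phi > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for w in (0.8, 1.0, 1.2):
    print(w, round(optimal_hours(w), 3))  # hours rise smoothly with the real wage
```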

The final court of appeal for macroeconomic models is the real world, and as long as no convincing justification is put forward for how the inferential bridging to that world is de facto made, macroeconomic model building is little more than “hand waving” that gives us rather little warrant for making inductive inferences from models to real-world target systems. If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

To Keynes this was self-evident. But obviously not so to ‘New Keynesians’.

DeLong smacks down Krusell again

9 June 2014, 09:29 | Posted in Economics | 4 comments

So I wrote this on Friday, and put it aside because I feared that it might be intemperate, and I do not want to post intemperate things in this space.

Today, Sunday, I cannot see a thing I want to change–save that I am, once again, disappointed by the quality of critics of Piketty: please step up your game, people!

In response to my Department of “Huh?!”–I Don’t Understand More and More of Piketty’s Critics: Per Krusell and Tony Smith, Per Krusell unfortunately writes:

”Brad DeLong has written an aggressive answer to our short note…. Worry about increasing inequality… is no excuse for [Thomas Piketty’s] using inadequate methodology or misleading arguments…. We provided an example calculation where we assigned values to parameters—among them the rate of depreciation. DeLong’s main point is that the [10%] rate we are using is too high…. Our main quantitative points are robust to rates that are considerably lower…. DeLong’s main point is a detail in an example aimed mainly, it seems, at discrediting us by making us look like incompetent macroeconomists. He does not even comment on our main point; maybe he hopes that his point about the depreciation rate will draw attention away from the main point. Too bad if that happens, but what can we do…”

Let me assure one and all that I focused–and focus–on the depreciation assumption because it is an important and central assumption. It plays a very large role in whether reductions in trend real GDP growth rates (and shifts in the incentive to save driven by shifts in tax regimes, revolutionary confiscation probabilities, and war) can plausibly drive large shifts in wealth-to-annual-income ratios. The intention is not to distract with inessentials. The intention is to focus attention on what is a key factor, as is well-understood by anyone who has control over their use of the Solow growth model.

Consider the Solow growth model Krusell and Smith deploy, calibrated to what Piketty thinks of as typical values for the 1914-80 Social Democratic Era: a population trend growth rate n=1%/yr, a labor-productivity trend growth rate g=2%/yr, W/Y=3. Adding in the Krusell-Smith depreciation assumption of 10%/yr means … that a fall in n+g from 3%/yr to 1%/yr holding the gross savings rate constant generates a rise in the steady-state wealth-to-annual-net-income ratio from 3 to 3.75–not a very big jump for a very large shift in economic growth: the total rate of growth n+g has fallen by 2/3, but W/Y has only jumped by a quarter. Adopting a less ludicrously-awry “Piketty” depreciation assumption of 3%/yr generates quantitatively (and qualitatively!) different results: a rise in the steady-state wealth-to-annual-net-income ratio from 3 to 4.708–the channel is more than twice that powerful …

We have a very large drop in Piketty’s calculations of northwest European economy-wide wealth-to-annual-net-income ratios from the Belle Époque Era that ended in 1914 to the Social Democratic Era of 1914-1980 to account for. How would we account for this other than by (a) reduced incentives for wealthholders to save and reinvest and (b) shifts in trend rates of population and labor-productivity growth? We are now in a new era, with rising wealth-to-annual-net-income ratios. We would like to be able to forecast how far W/Y will rise given the expected evolution of demography and technology and given expectations about incentives for wealthholders to save and reinvest.

How do Krusell and Smith aid us in our quest to do that?

Brad DeLong
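DeLong’s numbers are easy to check. Here is a short sketch of the arithmetic (my own reconstruction, assuming a Solow steady state with a constant gross saving rate s, so that capital relative to gross output is s/(n+g+δ) and relative to net output Y − δK it is s/(n+g+δ(1−s))):

```python
# Reconstruction of DeLong's back-of-the-envelope numbers (my own sketch).
# Solow steady state with a constant gross saving rate s: K/Y_gross = s/(n+g+delta).
# In terms of net income Y_net = Y_gross - delta*K this becomes
# K/Y_net = s / (n + g + delta*(1 - s)).

def net_ratio(s, ng, delta):
    """Steady-state wealth-to-annual-net-income ratio."""
    return s / (ng + delta * (1.0 - s))

def calibrated_saving_rate(target_net_ratio, ng, delta):
    """Invert the formula above: the gross saving rate that delivers W/Y_net = target."""
    return target_net_ratio * (ng + delta) / (1.0 + target_net_ratio * delta)

for delta in (0.10, 0.03):                        # Krusell-Smith vs. "Piketty" depreciation
    s = calibrated_saving_rate(3.0, 0.03, delta)  # calibrate W/Y_net = 3 at n+g = 3 %/yr
    print(delta, round(net_ratio(s, 0.01, delta), 3))  # then let n+g fall to 1 %/yr
# delta = 0.10 gives about 3.75; delta = 0.03 gives about 4.71 (DeLong's 4.708)
```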

Från och med Du (privat)

8 June 2014, 20:50 | Posted in Varia | Comments Off on Från och med Du (privat)

 

Per T Ohlsson completely off the mark on Piketty

8 June 2014, 13:29 | Posted in Economics | 2 comments

One of the country’s best-paid journalists — Per T Ohlsson — today delivers, in his recurring Sunday column in Sydsvenskan, a more than usually poorly substantiated article on Thomas Piketty’s Capital in the Twenty-First Century.

The following little passage is illustrative:

One weakness of Capital in the Twenty-First Century, however, is that Piketty’s alarmist conclusions are not based on historical data but on theoretical models, ”laws”, which he has constructed himself from highly debatable assumptions about saving and growth. Moreover, Piketty disregards factors that slow down the spread of inequality he deems preordained, such as education and new technology. Several heavyweight economists have pointed this out, among them the Swede Per Krusell together with his colleague Tony Smith of Yale.

As readers of this blog have been able to see over the past week — here, here and here — this is far from consistent with reality. Not least the ”heavyweight” economists Krusell & Smith have — as the even ”heavier” economist Brad DeLong has shown — a poor grounding in reality, to say the least, when it comes to the numerical values they have employed in their model-based attempts to refute Piketty.

When it comes down to it, the ”heavyweight” objections to Piketty turn out to carry very little weight indeed …

The Last of the Mohicans

7 June 2014, 19:13 | Posted in Varia | Comments Off on The Last of the Mohicans

 

Neoclassical economics — immunized against reality

7 June 2014, 14:49 | Posted in Economics, Theory of Science & Methodology | 1 comment

 

The economy, the system of market relationships between members of society, is viewed as a relatively closed network of forces, as a system, which indeed receives a certain external impetus, but functions independently of factors such as those mentioned above, which cannot be ascertained with economic tools …
If one gains clarity about these relationships, then one can begin to understand why the models constructed with the help of simple behavioral assumptions by neoclassical oriented theoreticians must be immunized against experience in one way or another if their failure is to be avoided. It is not by chance that the attempts of some proponents of pure economics to achieve autonomous theory formation tend to be translated methodologically into model Platonism: the immunization from the influence of non-economic factors leads to the immunization from experience in general. It appears that the diagnosis of the fundamental methodological weakness of the neoclassical way of thinking must lead to an aversion to sociology.

By contrast, regardless of all methodological differences, all heterodox currents in economics characteristically share one element: the accentuation of the significance of social factors for economic relationships and the consciousness of the fact that the social domain analyzed by pure economics is embedded in a more comprehensive social complex that cannot be abstracted away from with no further ado if useful explanations are being sought. The methodological weakness of these currents should not prevent one from seeing what is, in my view, the decisive point, which generally tends to be buried amidst an array of irrelevant arguments about subordinate problems or pseudo-problems, such as those about the applicability of mathematical expressions, the usage of certain types of terms, the question of the preferability of generalizing or pointedly emphasizing abstraction, etc.

Hans Albert

Flummoxed Brad DeLong answers Krusell & Co.

6 June 2014, 08:50 | Posted in Economics | 5 comments

Krusell and Smith favor a depreciation rate of 10%/year – and I genuinely do not understand why they think it is appropriate. We are not, after all, dealing with short-run business-cycle fluctuations in which the pieces of the capital stock that vary are made up mostly of inventories and machines here. We are talking about land, very durable buildings, powerful property rights and the ability to summon the police to protect them – claims over future output that do not, I think, erode away at anything like 10%/year …

Starting around 1980, Piketty argues, the North Atlantic shifted out of its Social-Democratic Era and is now moving into a new configuration, with increasingly-concentrated wealth, savings no longer reduced by highly progressive capital taxation and fear of expropriation, and slower rates of population and labor productivity growth. Piketty expects the consequence to be a rise in the savings rate back to Belle Époque levels and a return to the capital intensity and inherited-wealth dominance of those days. In my view, the next questions are two:

Would this be a good thing? More savings and wealth accumulation by the rich that increase the capital intensity of the economy increase real wages for the working class and the poor, no? Here I think the answer is perhaps – and I think this is what the debate over Piketty should be about. Unfortunately, that is not the debate we are having …

Can this happen? And Krusell and Smith and company are saying: no, it cannot. As the capital-output ratio rises, the desire to consume wealth pushes the gross savings rate down, and the fact that capital depreciates at 10%/year pushes the net savings rate down much further, and so there are no macroeconomic forces in play that could push the wealth-to-annual-net-income ratio far up above its current value of 300%.

But if the savings rate necessarily falls as the wealth-to-annual-net-income ratio rises, why was the (gross) savings rate half again as high back before World War I when the economy was wealth-dominated as it is today? And from where comes the 10%/year depreciation rate assumption?

What we clearly have here is a failure to communicate. And I really, really do not think that it is the result of a failure to try on Thomas Piketty’s part.

Brad DeLong

[h/t Jan Milch]

Pat Metheny

5 June 2014, 22:46 | Posted in Varia | Comments Off on Pat Metheny

 

[h/t Jan Milch]

Inequality in the UK

5 June 2014, 17:06 | Posted in Economics, Politics & Society | Comments Off on Inequality in the UK

 

New study shows why trickle-down economics is total horseshit

5 June 2014, 11:41 | Posted in Economics | 7 comments

The clear connections between wages, income, and living standards mean that progress in reversing inequality, boosting living standards, and alleviating poverty will be extraordinarily difficult without addressing wage growth. Indeed, converting the slow and unequal wage growth of the last three-and-a-half decades into broad-based wage growth is the core economic challenge of our time …

Despite increasing economy-wide productivity, wages for the vast majority of American workers have either stagnated or declined since 1979, and this weak wage growth extends even to those with a college degree …

Slow income growth for most American households is mainly due to weak hourly wage growth. In 1979, labor income accounted for 85.1 percent of total income for non-elderly households in the broad middle class, yet hourly compensation growth accounted for only about 17 percent of the increase of household incomes between 1979 and 2007—meaning it punched far below its weight …

Key economic evidence implicates policy decisions—and particularly changes in labor market policies and business practices—as more important in explaining the slowdown in hourly wages for the vast majority than many commonly accepted explanations (such as the interaction between technological change and the skills and credentials of American workers).

Josh Bivens et al./Economic Policy Institute

