Why doesn’t Krugman listen to Krugman?

31 July, 2014 at 22:50 | Posted in Economics | 1 Comment

Paul Krugman wonders why no one listens to academic economists … 

One answer is that economists don’t listen to themselves. More precisely, liberal economists like Krugman who want the state to take a more active role in managing the economy continue to teach an economic theory that has no place for activist policy.

Let me give a concrete example.

One of Krugman’s bugaboos is the persistence of claims that expansionary monetary policy must lead to higher inflation. Even after 5-plus years of ultra-loose policy with no rising inflation in sight, we keep hearing that since so “much money has been created…, there should already be considerable inflation” … As an empirical matter, of course, Krugman is right. But where could someone have gotten this idea that an increase in the money supply must always lead to higher inflation? Perhaps from an undergraduate economics class? Very possibly — if that class used Krugman’s textbook.

Here’s what Krugman’s International Economics says about money and inflation:

“A permanent increase in the money supply causes a proportional increase in the price level’s long-run value. … we should expect the data to show a clear-cut positive association between money supplies and price levels. If real-world data did not provide strong evidence that money supplies and price levels move together in the long run, the usefulness of the theory of money demand we have developed would be in severe doubt …

A permanent increase in the level of a country’s money supply ultimately results in a proportional rise in its price level but has no effect on the long-run values of the interest rate or real output.”

This last sentence is simply the claim that money is neutral in the long run, which Krugman continues to affirm on his blog …
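The logic behind the quoted passage is, in essence, the textbook’s long-run money-market condition (a compressed sketch of the standard argument, not a quotation from the book): with real money demand L(R, Y) and with the interest rate R and output Y pinned at their long-run, money-neutral values,

\[
\frac{M}{P} = L(R, Y) \quad\Longrightarrow\quad P = \frac{M}{L(R, Y)},
\]

so a permanent rise in M translates one-for-one into the long-run price level, with the interest rate and output untouched.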

You might think these claims about money and inflation are unfortunate oversights, or asides from the main argument. They are not. The assumption that prices must eventually change in proportion to the central bank-determined money supply is central to the book’s four chapters on macroeconomic policy in an open economy …

So these are not throwaway lines. The more thoroughly a student understands the discussion in Krugman’s textbook, the stronger should be their belief that sustained expansionary monetary policy must be inflationary. Because if it is not, Krugman gives you no tools whatsoever to think about policy …

Liberal Keynesian economists made a deal with the devil decades ago, when they conceded the theoretical high ground. Paul Krugman the textbook author says authoritatively that money is neutral in the long run and that a permanent increase in the money supply can only lead to inflation. Why shouldn’t people listen to him, and ignore Paul Krugman the blogger?

J. W. Mason/The Slack Wire

My blog is skyrocketing!

31 July, 2014 at 21:59 | Posted in Varia | Leave a comment


Tired of the idea of an infallible mainstream neoclassical economics and its perpetuation of spoon-fed orthodoxy, yours truly launched this blog in March 2011. The number of visitors has increased steadily, and now, three and a half years later, with almost 125 000 views per month, I have to admit that — given the somewhat wonkish character of the blog, with posts mostly on economic theory, statistics, econometrics, theory of science and methodology — I am still rather gobsmacked that so many are interested and take the time to read the often rather geeky stuff on this blog.

In the 21st century the blogosphere has without any doubt become one of the greatest channels for dispersing new knowledge and information. As a blogger I can specialize in those particular topics an economist and critical realist professor of social science happens to have both deep knowledge of and interest in. That, of course, also means — in the modern long tail world — being able to target a segment of readers with much narrower and more specialized interests than newspapers and magazines as a rule could aim for — and still attract quite a lot of readers.

Economic growth and the male organ — does size matter?

31 July, 2014 at 19:51 | Posted in Economics | Leave a comment

Economic growth has long interested economists. Not least, the question of which factors lie behind high growth rates has been in focus. The factors usually pointed to are mainly economic, social and political variables. In an interesting study from the University of Helsinki, Tatu Westling has expanded the set of potential causal variables to also include biological and sexual ones. In the report Male Organ and Economic Growth: Does Size Matter (2011), he has — based on the “cross-country” data of Mankiw et al (1992), Summers and Heston (1988), Polity IV Project data on political regime types and a new data set on average penis size in 76 non-oil-producing countries (www.everyoneweb.com/worldpenissize) — been able to show that the level and growth of GDP per capita between 1960 and 1985 vary with penis size. Replicating Westling’s study — I have used my favourite program Gretl — we obtain the following two charts:


The Solow-based model estimates show that maximum GDP is achieved at a penis length of about 13.5 cm, and that the male reproductive organ (OLS without control variables) is negatively correlated with — and able to explain 20% of the variation in — GDP growth.
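For readers who want to play along, here is a minimal sketch in Python rather than Gretl, run on made-up numbers (the coefficients and variable names are hypothetical, chosen only so that the fitted quadratic peaks near the reported 13.5 cm; this is not Westling’s data or code):

import numpy as np
import statsmodels.api as sm

# Hypothetical cross-section of 76 countries: average size and log GDP per capita
rng = np.random.default_rng(42)
size_cm = rng.uniform(9, 18, 76)
log_gdp = 8 + 0.6 * size_cm - 0.022 * size_cm**2 + rng.normal(0, 0.3, 76)

# OLS of log GDP on size and size squared (no control variables)
X = sm.add_constant(np.column_stack([size_cm, size_cm**2]))
fit = sm.OLS(log_gdp, X).fit()
b0, b1, b2 = fit.params

# With a concave quadratic (b2 < 0), fitted GDP peaks at -b1 / (2 * b2)
print(fit.summary())
print("GDP-maximising size (cm):", -b1 / (2 * b2))

The same two-step logic (fit a quadratic, read off the turning point) is all that lies behind the ‘optimal size’ figure.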

Even with reservations for problems such as endogeneity and confounders, one cannot but agree with Westling’s final assessment that “the ‘male organ hypothesis’ is worth pursuing in future research” and that it “clearly seems that the ‘private sector’ deserves more credit for economic development than is typically acknowledged.” Or? …

Keynes and Kyburg — showing Bayesianism to be ‘patently absurd’

31 July, 2014 at 17:59 | Posted in Statistics & Econometrics | 1 Comment

Back in 1991, when I earned my first Ph.D. — with a dissertation on decision making and rationality in social choice theory and game theory — yours truly concluded that “repeatedly it seems as though mathematical tractability and elegance — rather than realism and relevance — have been the most applied guidelines for the behavioural assumptions being made. On a political and social level it is doubtful if the methodological individualism, ahistoricity and formalism they are advocating are especially valid.”

This, of course, was like swearing in church. My mainstream neoclassical colleagues were — to say the least — not exactly überjoyed.

The decision-theoretical approach I was perhaps most critical of was the one building on the then reawakened Bayesian subjectivist interpretation of probability.

One of my inspirations when working on the dissertation was Henry E. Kyburg, and I still think his critique is the ultimate take-down of Bayesian hubris (emphasis added):

From the point of view of the “logic of consistency” (which for Ramsey includes the probability calculus), no set of beliefs is more rational than any other, so long as they both satisfy the quantitative relationships expressed by the fundamental laws of probability. Thus I am free to assign the number 1/3 to the probability that the sun will rise tomorrow; or, more cheerfully, to take the probability to be 9/10 that I have a rich uncle in Australia who will send me a telegram tomorrow informing me that he has made me his sole heir. Neither Ramsey, nor Savage, nor de Finetti, to name three leading figures in the personalistic movement, can find it in his heart to detect any logical shortcomings in anyone, or to find anyone logically culpable, whose degrees of belief in various propositions satisfy the laws of the probability calculus, however odd those degrees of belief may otherwise be. Reasonableness, in which Ramsey was also much interested, he considered quite another matter. The connection between rationality (in the sense of conformity to the rules of the probability calculus) and reasonableness (in the ordinary inductive sense) is much closer for Savage and de Finetti than it was for Ramsey, but it is still not a strict connection; one can still be wildly unreasonable without sinning against either logic or probability.

Now this seems patently absurd. It is to suppose that even the most simple statistical inferences have no logical weight where my beliefs are concerned. It is perfectly compatible with these laws that I should have a degree of belief equal to 1/4 that this coin will land heads when next I toss it; and that I should then perform a long series of tosses (say, 1000), of which 3/4 should result in heads; and then that on the 1001st toss, my belief in heads should be unchanged at 1/4. It could increase to correspond to the relative frequency in the observed sample, or it could even, by the agency of some curious maturity-of-odds belief of mine, decrease to 1/8. I think we would all, or almost all, agree that anyone who altered his beliefs in the last-mentioned way should be regarded as irrational. The same is true, though perhaps not so seriously, of anyone who stuck to his beliefs in the face of what we would ordinarily call contrary evidence. It is surely a matter of simple rationality (and not merely a matter of instinct or convention) that we modify our beliefs, in some sense, some of the time, to conform to the observed frequencies of the corresponding events.

There is another argument against both subjectivistic and logical theories that depends on the fact that probabilities are represented by real numbers … The point can be brought out by considering an old fashioned urn containing black and white balls. Suppose that we are in an appropriate state of ignorance, so that, on the logical view, as well as on the subjectivistic view, the probability that the first ball drawn will be black, is a half. Let us also assume that the draws (with replacement) are regarded as exchangeable events, so that the same will be true of the i-th ball drawn. Now suppose that we draw a thousand balls from this urn, and that half of them are black. Relative to this information both the subjectivistic and the logical theories would lead to the assignment of a conditional probability of 1/2 to the statement that a black ball will be drawn on the 1001st draw …

Although it does seem perfectly plausible that our bets concerning black balls and white balls should be offered at the same odds before and after the extensive sample, it surely does not seem plausible to characterize our beliefs in precisely the same way in the two cases …

This is a strong argument, I think, for considering the measure of rational belief to be two dimensional; and some writers on probability have come to the verge of this conclusion. Keynes, for example, considers an undefined quantity he calls “weight” to reflect the distinction between probability-relations reflecting much relevant evidence, and those which reflect little evidence …

Though Savage distinguishes between these probabilities of which he is sure and those of which he is not so sure, there is no way for him to make this distinction within the theory; there is no internal way for him to reflect the distinction between probabilities which are based on many instances and those which are based on only a few instances, or none at all.

Henry E. Kyburg
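Kyburg’s urn example is easy to make concrete. Under one standard Bayesian setup (a uniform Beta(1,1) prior over the urn’s proportion of black balls, which is my illustrative assumption, not anything Kyburg specifies), the predictive probability of drawing black next is one half both before and after observing 500 black balls in 1000 draws, while the posterior over the urn’s composition narrows dramatically. That second dimension is precisely what the probability number alone fails to register:

from scipy import stats

# 'Ignorance' prior over the proportion of black balls, and the posterior
# after observing 500 black balls in 1000 draws (with replacement)
prior = stats.beta(1, 1)
posterior = stats.beta(1 + 500, 1 + 500)

for label, dist in [("before", prior), ("after", posterior)]:
    p_black_next = dist.mean()        # predictive probability that the next draw is black
    lo, hi = dist.interval(0.95)      # 95% credible interval for the proportion
    print(f"{label:6s}  P(black next) = {p_black_next:.3f}   95% interval = ({lo:.3f}, {hi:.3f})")

Both lines report P(black next) = 0.500, but the interval shrinks from roughly (0.03, 0.97) to roughly (0.47, 0.53).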

The reference Kyburg makes to Keynes and his concept of “weight of argument” is extremely interesting.

Almost a hundred years after John Maynard Keynes wrote his seminal A Treatise on Probability (1921), it is still very difficult to find statistics textbooks that seriously try to incorporate his far-reaching and incisive analysis of induction and evidential weight.

The standard view in statistics – and the axiomatic probability theory underlying it – is to a large extent based on the rather simplistic idea that “more is better.” But as Keynes argues, “more of the same” is not what is important when making inductive inferences. It’s rather a question of “more but different.”

Variation, not replication, is at the core of induction. Finding that p(x|y) = p(x|y & w) doesn’t make w “irrelevant.” Knowing that the probability is unchanged when w is present gives p(x|y & w) another evidential weight (“weight of argument”). Running 10 replicative experiments does not make you as “sure” of your inductions as running 10 000 varied experiments – even if the probability values happen to be the same.
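In Keynes’s own notation in the Treatise, where a/h is the probability of the conclusion a on evidence h and V(a/h) its weight, the point can be put compactly (my paraphrase of his chapter on the weight of arguments):

\[
V(a/hh_1) > V(a/h) \qquad \text{even when} \qquad a/hh_1 = a/h ,
\]

that is, taking relevant evidence h1 on board always increases the weight of the argument, even when it leaves the probability itself untouched.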

According to Keynes we live in a world permeated by unmeasurable uncertainty – not quantifiable stochastic risk – which often forces us to make decisions based on anything but “rational expectations.” Keynes rather thinks that we base our expectations on the confidence or “weight” we put on different events and alternatives. To Keynes expectations are a question of weighing probabilities by “degrees of belief,” beliefs that often have precious little to do with the kind of stochastic probabilistic calculations made by the rational agents as modeled by “modern” social sciences. And often we “simply do not know.” As Keynes writes in the Treatise:

The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be [that] the system of the material universe must consist of bodies … such that each of them exercises its own separate, independent, and invariable effect, a change of the total state being compounded of a number of separate changes each of which is solely due to a separate portion of the preceding state … Yet there might well be quite different laws for wholes of different degrees of complexity, and laws of connection between complexes which could not be stated in terms of laws connecting individual parts … If different wholes were subject to different laws qua wholes and not simply on account of and in proportion to the differences of their parts, knowledge of a part could not lead, it would seem, even to presumptive or probable knowledge as to its association with other parts … These considerations do not show us a way by which we can justify induction … [p. 427] No one supposes that a good induction can be arrived at merely by counting cases. The business of strengthening the argument chiefly consists in determining whether the alleged association is stable, when accompanying conditions are varied … [p. 468] In my judgment, the practical usefulness of those modes of inference … on which the boasted knowledge of modern science depends, can only exist … if the universe of phenomena does in fact present those peculiar characteristics of atomism and limited variety which appears more and more clearly as the ultimate result to which material science is tending.

Science according to Keynes should help us penetrate to “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts.” Models can never be more than a starting point in that endeavour. He further argued that it was inadmissible to project history on the future. Consequently we cannot presuppose that what has worked before, will continue to do so in the future. That statistical models can get hold of correlations between different “variables” is not enough. If they cannot get at the causal structure that generated the data, they are not really “identified.”

How strange that writers of statistics textbooks as a rule do not even touch upon these aspects of scientific methodology, which seem so fundamental and important for anyone trying to understand how we learn and orient ourselves in an uncertain world. An educated guess as to why this is so would be that Keynes’s concepts are not possible to squeeze into a single calculable numerical “probability.” In the quest for quantities one turns a blind eye to qualities and looks the other way – but Keynes’s ideas keep creeping out from under the statistics carpet.

It’s high time that statistics textbooks gave Keynes his due — and that we re-read Henry E. Kyburg!

Nancy Cartwright on RCTs

31 July, 2014 at 08:56 | Posted in Theory of Science & Methodology | Leave a comment

I’m fond of science philosophers like Nancy Cartwright. With razor-sharp intellects they immediately go for the essentials. They have no time for bullshit. And neither should we.

In Evidence: For Policy — downloadable here — Cartwright has assembled her papers on how better to use evidence from the sciences “to evaluate whether policies that have been tried have succeeded and to predict whether those we are thinking of trying will produce the outcomes we aim for.” Many of the collected papers center around what can and cannot be inferred from results in well-done randomised controlled trials (RCTs).

A must-read for everyone with an interest in the methodology of science.

Wren-Lewis on economic methodology

30 July, 2014 at 17:09 | Posted in Economics | 2 Comments

Simon Wren-Lewis has a post up today discussing why the New Classical Counterrevolution (NCCR) was successful in replacing older theories, despite the fact that the New Classical models weren’t able to explain what happened to output and inflation in the 1970s and 1980s:

The new theoretical ideas New Classical economists brought to the table were impressive, particularly to those just schooled in graduate micro. Rational expectations is the clearest example …

However, once the basics of New Keynesian theory had been established, it was quite possible to incorporate concepts like rational expectations or Ricardian Equivalence into a traditional structural econometric model (SEM) …

The real problem with any attempt at synthesis is that a SEM is always going to be vulnerable to the key criticism in Lucas and Sargent, 1979: without a completely consistent microfounded theoretical base, there was the near certainty of inconsistency brought about by inappropriate identification restrictions …

So why does this matter? … If mainstream academic macroeconomists were seduced by anything, it was a methodology – a way of doing the subject which appeared closer to what at least some of their microeconomic colleagues were doing at the time, and which was very different to the methodology of macroeconomics before the NCCR. The old methodology was eclectic and messy, juggling the competing claims of data and theory. The new methodology was rigorous!

Wren-Lewis seems to be überimpressed by the “rigour” brought to macroeconomics by the New Classical counterrevolution and its rational expectations, microfoundations and ‘Lucas Critique’.

I fail to see why.

Contrary to what Wren-Lewis seems to argue, I would say that the recent economic crisis, and the fact that New Classical economics has had next to nothing to contribute to understanding it, show that New Classical economics is a degenerative research program in dire need of replacement.

The predominant strategy in mainstream macroeconomics today is to build models and make things happen in these “analogue-economy models.” But although macro-econometrics may have supplied economists with rigorous replicas of real economies, if the goal of theory is to be able to make accurate forecasts or explain what happens in real economies, this ability to — ad nauseam — construct toy models does not give much leverage.

“Rigorous” and “precise” New Classical models cannot be considered anything else than unsubstantiated conjectures as long as they aren’t supported by evidence from outside the theory or model. To my knowledge, no decisive empirical evidence of any kind has been presented.

And — applying a “Lucas critique” to New Classical models, it is obvious that they too fail. Changing “policy rules” cannot just be presumed not to influence investment and consumption behavior, and a fortiori technology, thereby contradicting the invariance assumption. Technology and tastes cannot live up to the status of an economy’s deep and structurally stable Holy Grail. They too are part and parcel of an ever-changing and open economy. Lucas’s hope of being able to model the economy as “a FORTRAN program” and “gain some confidence that the component parts of the program are in some sense reliable prior to running it” therefore seems – from an ontological point of view – totally misdirected. The failure of the attempt to anchor the analysis in the allegedly stable deep parameters “tastes” and “technology” shows that if you neglect ontological considerations pertaining to the target system, reality ultimately gets its revenge when, at last, questions of bridging and exportation of model exercises are laid on the table.
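Lucas’s own formalisation of the critique (my paraphrase of Lucas 1976, not a quotation) makes the point easy to state. If observed behaviour and policy are described by

\[
y_{t+1} = F(y_t, x_t, \theta, \varepsilon_t), \qquad x_t = G(y_t, \lambda, \eta_t),
\]

then the estimated behavioural parameters are really functions of the policy rule, \(\theta = \theta(\lambda)\), and shift whenever \(\lambda\) does. The argument above simply turns that logic on the New Classicals themselves: nothing guarantees that the allegedly deep parameters of tastes and technology are invariant to policy either.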

No matter how precise and rigorous the analysis, and no matter how hard one tries to cast the argument in modern mathematical form, these models do not push economic science forward one millimeter if they do not stand the acid test of relevance to the target. No matter how clear, precise, rigorous or certain the inferences delivered inside these models are, they do not per se say anything about real-world economies.


RBC and the Lucas-Rapping theory of unemployment

30 July, 2014 at 13:07 | Posted in Economics | 2 Comments

Lucas and Rapping (1969) claim that cyclical increases in unemployment occur when workers quit their jobs because wages or salaries fall below expectations …

According to this explanation, when wages are unusually low, people become unemployed in order to enjoy free time, substituting leisure for income at a time when they lose the least income …

According to the theory, quits into unemployment increase during recessions, whereas historically quits decrease sharply and roughly half of unemployed workers become jobless because they are laid off … During the recession I studied, people were even afraid to change jobs because new ones might prove unstable and lead to unemployment …

If wages and salaries hardly ever fall, the intertemporal substitution theory is widely applicable only if the unemployed prefer jobless leisure to continued employment at their old pay. However, the attitude and circumstances of the unemployed are not consistent with their having made this choice …

In real business cycle theory, unemployment is interpreted as leisure optimally selected by workers, as in the Lucas-Rapping model. It has proved difficult to construct business cycle models consistent with this assumption and with real wage fluctuations as small as they are in reality, relative to fluctuations in employment.

Truman F. Bewley

This is, of course, only what you would expect of New Classical Chicago economists.

But sadly enough, this extraterrestrial view of unemployment is actually shared by so-called New Keynesians, whose microfounded dynamic stochastic general equilibrium models cannot even incorporate such a basic fact of reality as involuntary unemployment!

Of course, working with microfounded representative-agent models, this should come as no surprise. If one representative agent is employed, all representative agents are. The kind of unemployment that occurs is voluntary, since it is only adjustments of the hours of work that these optimizing agents make to maximize their utility.

In the basic DSGE models used by most ‘New Keynesians’, the labour market is always cleared – responding to a changing interest rate, expected lifetime income, or real wages, the representative agent maximizes the utility function by varying her labour supply, money holdings and consumption over time. Most importantly – if the real wage somehow deviates from its “equilibrium value,” the representative agent adjusts her labour supply, so that when the real wage is higher than its “equilibrium value,” labour supply is increased, and when the real wage is below its “equilibrium value,” labour supply is decreased.
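Spelled out, the mechanism is the standard intratemporal first-order condition of such models (a generic sketch, not taken from any particular New Keynesian paper): with period utility U(c_t, l_t) over consumption and leisure and real wage w_t, the representative agent sets

\[
\frac{U_l(c_t, l_t)}{U_c(c_t, l_t)} = w_t ,
\]

so hours worked always lie on the labour supply curve, and any fall in employment is, by construction, a chosen increase in leisure rather than involuntary unemployment.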

In this model world, unemployment is always an optimal response to changes in labour market conditions. Hence, unemployment is totally voluntary. To be unemployed is something one optimally chooses to be.

The final court of appeal for macroeconomic models is the real world.

If substantive questions about the real world are being posed, it is the formalistic-mathematical representations utilized to analyze them that have to match reality, not the other way around.

To Keynes this was self-evident. But obviously not so to New Classical and ‘New Keynesian’ economists.

Look who’s talking!

30 July, 2014 at 09:47 | Posted in Economics | 1 Comment

I think a lot of the work in Keynesian economics has gotten too far away from thinking about individuals and their decisions at all. Keynesians don’t often worry about what actual individuals are doing. They look at mechanical statistical relationships that have no connection with what real individuals are actually doing.

Robert Lucas Interviewed in The Margin

And this comes from an economist who has repeatedly argued that progress in economics lies in the pursuit of the ambition to “tell better and better stories”:

We are storytellers, operating much of the time in worlds of make believe. We do not find that the realm of imagination and ideas is an alternative to, or retreat from, practical reality. On the contrary, it is the only way we have found to think seriously about reality. In a way, there is nothing more to this method than maintaining the conviction … that imagination and ideas matter … there is no practical alternative.

Robert Lucas (1988) What Economists Do

An economist who, in search of a “technical model-building principle,” adopts the rational expectations view, according to which agents’ subjective probabilities are identified “with observed frequencies of the events to be forecast” and so coincide with “true” probabilities. This is a hypothesis that, he maintains,

will most likely be useful in situations in which the probabilities of interest concern a fairly well defined recurrent event, situations of ‘risk’ [where] behavior may be explainable in terms of economic theory … In cases of uncertainty, economic reasoning will be of no value … Insofar as business cycles can be viewed as repeated instances of essentially similar events, it will be reasonable to treat agents as reacting to cyclical changes as ‘risk’, or to assume their expectations are rational, that they have fairly stable arrangements for collecting and processing information, and that they utilize this information in forecasting the future in a stable way, free of systemic and easily correctable biases.

Robert Lucas (1981) Studies in Business-Cycle Theory
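Put formally, the rational expectations hypothesis Lucas is describing identifies agents’ subjective expectations with the conditional expectation generated by the model itself (a standard statement of the hypothesis, not a quotation from Lucas):

\[
E^{s}_{t}[x_{t+1}] = E[x_{t+1} \mid \Omega_t],
\]

where \(\Omega_t\) is the information set. Subjective probabilities are thus assumed to coincide with the model’s objective, frequency-matching ones, which is why Lucas himself confines the hypothesis to recurrent, risk-like situations.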

Living in his self-made analogue and unrealistic rational-expectations glass house, this guy should not throw stones.

On rigour and relevance …

30 July, 2014 at 09:27 | Posted in Varia | 3 Comments

[Cartoon on statistical significance]
[h/t Roger Erickson]

Normative multiculturalism

29 July, 2014 at 23:03 | Posted in Politics & Society | 3 Comments

The other day I listened to a male journalist who sat on a panel and was very upset that immigrants were being singled out as oppressors of women just because some of them beat their wives and forced them to wear the veil and stay indoors. Writing about such things in the newspapers was racist, and we should not imagine that we are so good at gender equality in Sweden either! There are still wage gaps here, so there! And besides, it is a cultural question!


On the panel sat a number of immigrant women who became so angry they nearly burst a blood vessel. There is a difference between Swedish wage injustices and pharaonic circumcision, threats and “honour killings.” “Are we supposed to keep quiet about what is happening just so as not to tarnish the reputation of our Men?” they said. “And if immigrants started slaughtering Swedish men for the sake of honour, would that still be a ‘cultural question’?”

Katarina Mazetti, Mazettis blandning (2001)

I fully understand these women’s indignation.

What the question ultimately comes down to is whether we, as citizens of a modern democratic society, should tolerate the intolerant.

People in our country who come from other countries, or who belong to groupings of various kinds whose kin and co-religionists may hold power and rule with brutal intolerance, must of course be embraced by our tolerance. But it is equally obvious that this tolerance applies only as long as that intolerance is not practised in our own society.

Culture, identity, ethnicity, gender and religiosity can never be accepted as grounds for intolerance in political and civic matters. In a modern democratic society, people who belong to these various groups must be able to count on society also protecting them against the abuses of intolerance. All citizens must have the freedom and the right to question and to leave their own group. Towards those who do not accept that tolerance, we must be intolerant.

In Sweden we have long uncritically cherished an unspecified and undefined multiculturalism. If by multiculturalism we mean that there are several different cultures in our society, this poses no problem. Then we are all multiculturalists.

But if by multiculturalism we mean that cultural belonging and identity also carry specific moral, ethical and political rights and obligations, we are talking about something entirely different. Then we are talking about normative multiculturalism. And accepting normative multiculturalism also means tolerating unacceptable intolerance, since normative multiculturalism implies that the rights of specific cultural groups may come to be given higher standing than the universal human rights of the individual citizen, and thereby indirectly become a defence of those groups’ (possible) intolerance. In a normatively multiculturalist society, institutions and rules can be used to restrict people’s freedom on the basis of unacceptable and intolerant cultural values.

Normative multiculturalism, just like xenophobia and racism, means that individuals are unacceptably reduced to passive members of culture- or identity-bearing groups. But tolerance does not mean that we must take a value-relativist attitude to identity and culture. Those who, in our society, show by their actions that they do not respect other people’s rights cannot expect us to be tolerant towards them. Those who want to use violence to force other people to submit to a particular group’s religion, ideology or “culture” are themselves responsible for the intolerance with which they must be met.

If we are to safeguard the achievements of modern democratic society, society must be intolerant of the intolerant normative multiculturalism. And then society cannot itself embrace a normative multiculturalism. In a modern democratic society the rule of law must apply, and apply to everyone!

Towards those in our society who want to force others to live according to their own religious, cultural or ideological beliefs and taboos, society should be intolerant. Towards those who want to force society to adapt its laws and rules to their own religion’s, culture’s or group’s interpretations, society should be intolerant. Towards those who are intolerant in deed, we should not be tolerant.
