Wednesday, November 29, 2006

Anti-Climate Change Regulation Requires a Salient Event? Justice Scalia asks "When is the Cataclysm?"

Now this is an interesting New York Times piece! It's not about Iraq, income inequality, or executive pay. Instead, it is a glimpse at how the Supreme Court engages in marginal analysis. The Justices appear to want precise estimates of the marginal damages caused by "excess" greenhouse gas emissions. Justice Scalia appears to believe in the importance of salient events as regulatory catalysts. Maybe he has read chapter three of my Green Cities book?

Note that John Roberts seems to understand "all else equal"! Maybe he took Ec 10 at Harvard? Note the idea of the "counter-factual" lurking in this discussion. Note that the Republicans on the court do not appear to be that excited about increasing the EPA's regulatory powers over a new "air pollutant" (i.e., carbon dioxide).


November 30, 2006
Justices’ First Brush With Global Warming
By LINDA GREENHOUSE

WASHINGTON, Nov. 29 — A Supreme Court argument Wednesday on the Bush administration’s refusal to regulate carbon dioxide in automobile emissions offered three intertwined plot lines to the audience that had come to watch the court’s first encounter with the issue of global climate change.

On one level, the argument was about the meaning of the Clean Air Act, which the Environmental Protection Agency maintains does not treat carbon dioxide and other heat-trapping gases as air pollutants and thus does not give the agency the authority to regulate them.

On another level, the argument was about whether the dozen states, three cities and many environmental groups that went to federal court to challenge the agency’s position had legal standing to pursue their lawsuit.

And on still another level, the courtroom action was an episode in a policy debate that began well before this case arrived on the Supreme Court’s docket and that will continue, in the political sphere, no matter what the justices decide.

By the end of the argument, that continuing debate appeared the only certain outcome.

The justices seemed deeply divided on the question of standing. Any plaintiff in federal court must establish standing to sue, by proving there is an injury that can be traced to the defendant’s behavior and that will be relieved by the action the lawsuit requests.

Chief Justice John G. Roberts Jr., along with Justices Antonin Scalia and Samuel A. Alito Jr., expressed strong doubts that the plaintiffs, represented by Assistant Attorney General James R. Milkey of Massachusetts, could meet those interrelated conditions by showing that global climate change presented a sufficiently tangible and imminent danger that could be adequately addressed by regulating emissions from new cars and trucks.

“You have to show the harm is imminent,” Justice Scalia instructed Mr. Milkey, asking, “I mean, when is the cataclysm?”

Mr. Milkey replied, “It’s not so much a cataclysm as ongoing harm,” arguing that Massachusetts, New York, and other coastal states faced losing “sovereign territory” to rising sea levels. “So the harm is already occurring,” he said. “It is ongoing, and it will happen well into the future.”

Chief Justice Roberts and Justice Alito both suggested that because motor vehicles account for only about 6 percent of carbon dioxide emissions, even aggressive federal regulation would not be great enough to make a difference, another requirement of the standing doctrine.

When Mr. Milkey replied that over time, “even small reductions can be significant,” Chief Justice Roberts responded: “That assumes everything else is going to remain constant, though, right? It assumes there isn’t going to be a greater contribution of greenhouse gases from economic development in China and other places that’s going to displace whatever marginal benefit you get here.” At another point, the chief justice said the plaintiffs’ evidence “strikes me as sort of spitting out conjecture on conjecture.”

On the other side, Justices Stephen G. Breyer, Ruth Bader Ginsburg, John Paul Stevens and David H. Souter appeared strongly inclined to find that the plaintiffs had met the standing test.

Justice Souter engaged Deputy Solicitor General Gregory G. Garre, the lawyer who was defending the administration’s position, in a long debate. When Mr. Garre said the plaintiffs “haven’t shown specific facts which should provide any comfort to this court that regulation of less than 6 percent or fewer greenhouse emissions worldwide will have any effect on their alleged injuries,” Justice Souter demanded: “Why do they have to show a precise correlation?”

“It is reasonable to suppose,” the justice continued, “that some reduction in the gases will result in some reduction in future loss.” It was “a question of more or less, not a question of either/or,” he said, adding: “They don’t have to stop global warming. Their point is that it will reduce the degree of global warming and likely reduce the degree of loss.”

Mr. Garre replied that given the problem’s global nature, “I’m not aware of any studies available that would suggest that the regulation of that minuscule fraction of greenhouse gas emissions would have any effect whatsoever.”

Then Justice Breyer took on the government lawyer. “Would you be up here saying the same thing if we’re trying to regulate child pornography, and it turns out that anyone with a computer can get pornography elsewhere?” Justice Breyer asked, adding, “I don’t think so.”

By the end of the argument there appeared a strong likelihood that the court would divide 5 to 4 on the standing question, with Justice Anthony M. Kennedy holding the deciding vote. His relatively few comments were ambiguous. Early in the argument he challenged the assertion by Mr. Milkey, the states’ lawyer, that the case “turns on ordinary principles of statutory interpretation and administrative law” and that there was no need for the court “to pass judgment on the science of climate change.”

That was “reassuring,” Justice Kennedy said. But, he added, “Don’t we have to do that in order to decide the standing argument, because there’s no injury if there’s not global warming?”

The justices eventually discussed the substance of the Environmental Protection Agency’s position. Mr. Garre said the agency had “responsibly and prudently” reached the conclusion that “Congress has not authorized it to embark on this regulatory endeavor.”

But the government lawyer seemed defensive when challenged by Justice Scalia on the agency’s view that carbon dioxide was not an air pollutant within the meaning of the Clean Air Act. Mr. Garre referred several times to “the conclusion the agency reached,” an unusual locution that seemed something short of the full embrace that lawyers from the solicitor general’s office usually offer the agencies whose positions they defend.

The Bush administration’s conclusion that the Clean Air Act does not authorize the E.P.A. to address climate change marked an about-face from the agency’s previous view of its legal authority.

The agency’s current position is that even if it had authority, it would choose for various policy reasons not to exercise it. That position was upheld in a fractured ruling by the federal appeals court here, a decision that led to the Supreme Court appeal, Massachusetts v. Environmental Protection Agency, No. 05-1120.

At this stage, even if the plaintiffs survive the challenge to their standing and the court finds that statutory authority exists, it is highly unlikely that the court would order the agency to undertake regulation. It would be a victory, Mr. Milkey agreed, if the justices went so far as to tell the E.P.A. to reconsider its position.

Green Cities Cracks Into one "Top 10" List

It may not be the David Letterman Top Ten list or the New York Times Book Review's top 15 list, but I'll take it! I salute Planetizen for its good judgment and I'm glad they liked my new book.

"Planetizen is pleased to release its sixth annual list of the ten best books in the planning field. With titles covering some of the most timely issues in planning -- such as the role of planning in the aftermath of disasters, along with the contentious and continuing debate over sprawl and smart growth -- the list gives readers an overview of the best ideas and writing in the field.

The Planetizen editorial staff based its 2007 edition list on a number of criteria, including editorial reviews, sales rankings, popularity, Planetizen reader nominations, number of references, recommendations from experts and the book's potential impact on the urban planning, development and design professions."

http://www.planetizen.com/books/2007

Ed Glaeser Takes on Tom Wolfe

Logical consistency has never been one of my strengths. Today, I wanted to blog about three different unrelated items. First, I wanted to note for the historical record that my Green Cities Amazon.com Sales Rank is slowly improving: #16,266 in Books (See Top Sellers in Books). I've been benchmarking my book relative to other recent economics books such as Janet Currie's, Acemoglu and Robinson's, and David Warsh's.

The second item I wanted to note was interesting environmental coverage in the New York Times today: see

http://www.nytimes.com/reuters/technology/tech-autoshow-plugins.html?_r=1&oref=slogin

and

http://www.nytimes.com/reuters/news/news-usa-court-warming.html

With regard to this second article, this is an interesting example of how Republican control of institutions affects environmental policy. The Supreme Court will rule on this:

"The case, known as Massachusetts v. EPA, was brought by a dozen states and 13 environmental organizations against the Environmental Protection Agency. The plaintiffs argue that the greenhouse gas emissions from cars, trucks and factories should be regulated by the U.S. government.

The EPA, along with 10 states, four motor vehicle trade associations and two coalitions of utility companies and other industries, maintain the agency lacks the authority to limit emissions of greenhouse gases such as carbon dioxide."


For the 3rd item, I wanted to point you to Ed Glaeser's new op-ed piece


Bonfire of the Landmarks

BY EDWARD GLAESER
November 29, 2006
URL: http://www.nysun.com/article/44261


Once again, the dark clouds of a real estate battle darken the sunny cooperatives of the Upper East Side. The neighborhood's great paladins of preservation are sharpening their verbal rapiers to wage war against those nefarious malefactors, the developers who make Manhattan more affordable by building more housing. The great melee on Madison, the fight over new construction on the old Parke-Bernet building, is starting.

Glamorous neighborhood activists will fight developer Aby Rosen and architect Norman Foster for the hearts and minds of the Landmarks Preservation Commission. That commission has the power to block or change any development plans for the site. Sunday, Tom Wolfe wrote a fiery broadside against the allegedly increasing control that developers have over the once independent Landmarks Preservation Commission.

As a student of land-use battles, I am in heaven. Not since Kevin Kline tried to persuade the Landmarks Commission by quoting Richard II on proportionality has a zoning fight been so exciting. I spent last night dreaming about the educational novella that Mr. Wolfe could write to further his cause. Perhaps the developer turned stoic of "A Man in Full" will try to build a shopping mall atop the New York Public Library only to be stopped by the brave opposition of Charlotte Simmons and the less despicable characters from "The Bonfire of the Vanities."

But my enthusiasm for the fight does not cause me to forget that this is an important battle. There are real benefits and costs from blocking this project. On the benefits side, blocking the project will preserve the view from the Carlyle and save neighbors from the curse of too much modernism as they throng to St. James Episcopal Church.

On the costs side, blocking new development ensures that Manhattan will stay unaffordable. New supply is the only way that middle-income people will be able to afford the city. When supply is choked off, people must pay more for housing, and firms in New York must pay higher salaries to their employees. The mayor has courageously fought for and achieved an increase in granting construction permits because he knows that the city's economic vitality depends on its affordability.

The core issue in this battle is respect for private property. Aby Rosen, not the community, owns 980 Madison Ave. The Landmarks Commission's stripping him of the ability to build would represent a great step in state power over private property. It seems odd for members of the normally non-socialist urban haute bourgeoisie to support so enthusiastically such a massive expropriation of private property.

Seen in this light, the Landmarks Preservation Commission is not a bastion of good government, but an appointed agency with a vast amount of relatively unchecked power. Over the past 40 years, the Landmarks Commission has created 85 historic districts and designated almost 23,000 buildings as landmarks, giving it control over vast tracts of the city's most valuable real estate. The power of the commission is certainly one explanation for the decline in granting building permits and the reduction in the height of new buildings. Mr. Kline's oration was not for naught. Twelve floors got lopped off the project he opposed.

The commission is filled with decent people trying to do what they think is right, but the composition of the commission leads them to put too much weight on aesthetics and too little weight on economics. I am sure that the Foster tower atop 980 Madison will be beautiful, but even if it were not, the tower would have economic benefits, and those benefits need to be weighed against any aesthetic costs. The mayor has the skills and incentives to weigh aesthetics and economics. I am less sure about the commission whose 11 members must include three architects, one landscape architect, and a city planner. These professions have a dangerous track record of putting aesthetics ahead of everything else.

The appointed and unsalaried nature of the commission also tilts the battle in favor of the charismatic abutters over the middle-income beneficiaries of new construction. If I were serving as an appointed, unsalaried member of the commission, I would find it hard to withstand the appeals of a great actor or writer. Perhaps, I could remind myself that they were acting in their own self-interest in what objectively is an extreme example of Nimbyism, but their eloquence would surely cause my concerns to wilt.

I am not enthusiastic about a powerful Landmarks Preservation Commission that can impose its aesthetic vision on the city unchecked. We have elected representatives who have their flaws, but they have incentives to take into account the needs of the city beyond the concerns of abutters. If the Landmarks Commission doesn't allow 980 Madison to rise, I hope that the mayor and City Council will rein the commission in.

Mr. Glaeser is the Glimp professor of economics at Harvard, director of the Taubman Center for State and Local Government, and a senior fellow at the Manhattan Institute.

Sunday, November 26, 2006

70,080 Hours of Marriage

Time is our scarce commodity. Have you wondered where the time goes? This e-mail below offers an answer. Would you fill out their survey? I think I will.

On an unrelated note, my wife and I are getting ready to move from Boston to Los Angeles, but last Wednesday we had a nice experience that one can't have in the sunshine out west. We were eating lunch at the Legal Seafoods near MIT and we spotted Paul Samuelson eating lunch with his wife. We took our five-year-old son over to chat with Paul, and Paul had some charming things to say to my spaced-out five-year-old. Last night, I took my wife, my son and my parents to dinner at the Legal Seafoods in Harvard Square. As we walked to our table, I spotted Paul Samuelson again, but this time we let him eat in peace. I pointed him out to my parents, and my kind mother suggested that Dr. Samuelson needs some hair gel (VO5?) to keep his curls in place. The Cambridge area does have some gravitational pull for academic economists.
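The "70,080 hours" figure in the e-mail below is easy to check: it corresponds to exactly eight simple 365-day years of marriage (leap days would push the true total a bit higher, which presumably is why the e-mail says "over"). A quick sketch:

```python
# The e-mail's "70,080 hours" figure corresponds to eight simple
# 365-day years of marriage (leap days would add a few dozen more hours).
HOURS_PER_DAY = 24
DAYS_PER_YEAR = 365

hours = 8 * DAYS_PER_YEAR * HOURS_PER_DAY
print(hours)  # 70080
```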


Date: Sun, 26 Nov 2006 18:15:27 -0800 (PST) [09:15:27 PM EST]
From: relationship@berkeley.edu
To: Matthew E Kahn
Subject: New York Times 8-year Anniversary Marriage Study
Headers: Show All Headers

Dear Mr. Kahn,

It has been over 70,080 hours since your wedding on May 29th, 1998. Here,
at the University of California at Berkeley, we are investigating the
relationship between marriage announcements and relationship satisfaction
for our Relationships, Emotions, and Life Study (
http://carpediem.berkeley.edu/real/ ). We found your marriage announcement
in the New York Times archive, and we are excited to be able to contact
you to find out more. Therefore, can we ask you for less than half an hour
of your time to learn about the rest of the 70,080 hours?

The study in its entirety consists of you filling out a short survey
online, located here: http://carpediem.berkeley.edu/real/
Based on your answers to some personality questionnaires, we’ll provide
you with interesting, confidential feedback and later provide you with a
summary of the most important findings from our study. Your email and
questionnaire responses will be kept completely confidential and will not
be distributed to anyone.

Thank you very much for considering participating in this study. Your
participation will advance the science on emotion and relationship
satisfaction, as well as provide you with interesting scientific feedback
on your own personality.

If possible, please forward this email to your spouse. If not, your
individual participation is greatly appreciated. (Even five minutes of
your time, for the first part of the survey would help us greatly.) Thank
you very much!

Laura Saslow [ http://socrates.berkeley.edu/keltner/people.htm ]
laura_saslow@berkeley.edu

Dacher Keltner, Ph. D. [
http://psychology.berkeley.edu/faculty/profiles/dkeltner.html ]

Berkeley Social Interaction Laboratory [
http://socrates.berkeley.edu/keltner/ ]

Relationships, Emotions, and Life Study Homepage
http://carpediem.berkeley.edu/real/

A New Urban Economics Blog

Given the importance of cities in the modern economy, it is surprising how little academic blogging there is focused on cities. There seem to be zillions of environmental blogs but few urban blogs. So, I was happy to learn about the birth of a new blog published at http://urbaneconomics.blogspot.com.

These are exciting days in academic urban economics. I've been told that the recent regional science meetings in Toronto were a great success. Ed Glaeser's profile in the New York Times in March 2006 was a salient example of some of the great research being done. See

http://www.nytimes.com/2006/03/05/magazine/305glaeser.1.html?ex=1299214800&en=2c54fd804ddaf7ae&ei=5088/

What surprises me is that many of the leading universities have no tenured urban economists these days. Ask Stanford, Yale, Chicago, Northwestern "who teaches your urban economics class?" and they will tell you that they don't offer the subject.

That's ugly! A majority of the world's people live and work in cities; it might be useful to have a better sense of how spatial considerations affect productivity and quality of life in such cities. This was my prime motivation for writing my Green Cities book.

So, I have a vested interest in there being greater demand for urban economists, but the broader intellectual point is to think about "fashion cycles" for sub-fields of academic economics. How do some fields such as development economics or behavioral economics rise up right now while other sub-fields are in less demand? How do economists decide "what is hot" and "what is not"? Maybe we should ask Paris Hilton?

Saturday, November 25, 2006

A Superstar City Prepares to Grow

New York City is wrestling with the question of how to increase its supply of land that can be developed. As this article below stresses, brownfields offer one source of land that can be revitalized.

I find this article interesting, and I'm impressed that the NYC political leaders are thinking ahead. I wonder whether tort lawyers will wait for the first cancer case to emerge from some unlucky family living in a new housing complex built on top of one of these brownfields. How will liability work here? Who is liable in this case? Will developers be eager to build on these brownfields, and will insurance companies offer policies?

While this article is not specific about how many new units of housing might be built, it challenges the claim made by Ed Glaeser and Joe Gyourko that New York City's high home prices are partially explained by zoning regulation making it quite costly to build new housing.

As I have driven around Queens and other NYC boroughs, it is clear to me that there are plenty of single detached homes and two-story row houses that could be bought up and knocked down, with tall housing towers built in their place. Look at Brooklyn today near the water and the large towers going up.




November 26, 2006
Bloomberg Administration Is Developing Land Use Plan to Accommodate Future Populations
By SAM ROBERTS

Faced with a shrinking inventory of vacant land, the Bloomberg administration next month will unveil its goals for accommodating the city’s growing population over the next 25 years and the municipal services that nine million or more New Yorkers will require.

Chief among the priorities of the Mayor’s Sustainability Advisory Board, led by Daniel L. Doctoroff, the deputy mayor for economic development, is how to reclaim as many as 1,700 acres of polluted land — brownfields and other former industrial parcels — and transform them into environmentally sound sites for schools, apartments and parks.

Among the other goals being explored by the board, which Mayor Michael R. Bloomberg appointed in September, are improving commuting times (at 39 minutes on average, now the highest of any big city in the nation), maintaining and protecting the drinking water supply, reducing sewage overflow into the city’s surrounding waterways during stormy weather, and reconciling the region’s burgeoning energy needs with clean-air standards.

The board is expected to outline its draft agenda by the end of next month, then place it before civic groups and the public for further debate. In mid-2007, the mayor will present its final goals — and strategies to achieve them.

Driven by higher birth rates among Hispanic and Asian New Yorkers, the influx of about 100,000 immigrants a year and a housing boom that is attracting more newcomers to each borough, the city’s population hit another peak last year, of 8.2 million. If those trends persist, the population is projected to reach at least nine million in the 2020’s.

Responding to the mayor’s pledge in his State of the City address last January to produce a “strategic land use plan” to deal with a city of nine million people, Mr. Doctoroff said: “We have the capacity through rezoning and underutilized land to go well over that number. But you cannot simply divorce the issue of growth from the infrastructure required to support it.

“It opens up great opportunities only if the growth is smart,” he said at the time, explaining that growth that fosters economic development could, for instance, help balance the city’s budget.

“Growth for its own sake is not necessarily good,” Mr. Doctoroff said. “We have a structural budget deficit. You want to ensure that the growth that takes place helps to ameliorate that deficit rather than exacerbates it.”

City officials declined to publicly elaborate on their proposals in advance of the advisory board’s announcement. But some of its goals were foreshadowed by two of the largest rezoning revisions in city history — of the Brooklyn waterfront in Greenpoint and Williamsburg and the Far West Side of Manhattan.

Both major zoning changes, coupled with other development proposals, including the Atlantic Yards project in Brooklyn, were aimed at revitalizing underutilized land for economic development and to expand the city’s property tax base. The zoning changes were accomplished, in part, by tying them to the city’s timetable to apply for the 2012 Olympic Games.

But the city lost its Olympic bid, which included the ill-fated proposal for combining a stadium for the Olympics and the Jets with an expanded convention center on the West Side. But Mr. Doctoroff maintains that he also viewed the Olympics as a vehicle to drive the sort of longer-range planning in which local governments rarely have the resources, or the vision, to indulge.

“Ultimately the most important point was to finally create the conditions for the West Side to develop,” he said.

The stadium proposal itself generated the most criticism and even some of Mr. Doctoroff’s supporters said he did not do enough political spadework to see it through. But Mr. Bloomberg, among others, says that Mr. Doctoroff has not gotten his due.

“Doctoroff, by the time he gets done,” Mr. Bloomberg said recently, “will have a greater impact on this city, I think, than Robert Moses in a much more democratic world where there’s a lot more community input and a lot more supervision from the courts and the Legislature.”

Planners who were interviewed agreed that the biggest growth constraint that the city faced was accommodating nine million people within its boundaries. Unlike newer cities in the South or the West, New York is not empowered to expand by annexing neighboring communities. But the decline in manufacturing and in maritime uses of the waterfront, coupled with zoning changes that determine density, encouraged developers to recycle existing land.

The advisory board’s biggest challenge, Mr. Doctoroff said earlier this year, is “how are we going to generate the land in a city where land is our scarcest resource to provide for the facilities that will support growth.

“It could be a zoning change, an investment in a form of transportation, it could be park space, working with Con Ed and Keyspan on energy needs,” he continued. “Power plants take acres and acres of land, but if we’re going to grow we’ve got to provide that.”

One possible source is the inventory of 1,700 contaminated acres in the city, many of which, consultants to the advisory board believe, can be reclaimed through improved technology. The technology has to be efficient enough, however, to make whatever is built there affordable.

If the brownfields were sufficiently decontaminated, those sites alone could accommodate 2,600 schools.

“We can’t afford to write off the land,” one advisor to the board said.

According to a Department of City Planning survey of land use, one- and two-family homes consume more space than any other category — more than 27 percent — followed by 25 percent devoted to open space and recreation, 12 percent to multifamily housing, 4 percent each to commercial and industrial uses, and a little more than 1 percent to parking. About 7 percent of the land surveyed was identified as vacant.

Separately, transportation officials say streets and highways take up about 32,000 acres, which would rank them third in an overall inventory of land use.

Present Discounted Value Calculations and the Benefits of Fighting Climate Change

The Stern Report has generated worldwide headlines. A leading economist argues that climate change will be quite costly, perhaps costing us 5% of world GNP each year in the future. In this interesting report on the Stern Review, Bill Nordhaus argues that many of the Stern Report's findings are driven by its implicit assumption of a near-0% interest rate in calculating the present discounted value of the flows of benefits and costs associated with climate change.

The Nordhaus piece is excerpted below.

This issue of discounting returns to the "old" Robert Solow question on sustainability: whether physical capital and human capital substitute for natural capital. If innovation (new ideas) and physical capital (better New Orleans levees) can protect natural capital, then it is important from a sustainability perspective to think about the next best alternative (i.e., the opportunity cost) for the resources we spend on fighting climate change today.
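Nordhaus's discounting point is easy to see with a toy calculation. The figures below are illustrative assumptions of mine (a $50 trillion world GNP, a constant damage stream of 5% of GNP, a 200-year horizon), not numbers from the Stern Review or from Nordhaus; the point is only that the present discounted value of the same damage stream varies enormously with the discount rate:

```python
# Present discounted value (PDV) of a constant annual damage stream.
# All numbers here are illustrative assumptions, not Stern/Nordhaus figures.

def pdv(annual_flow, rate, years):
    """PDV of a flow received at the end of each year for `years` years."""
    if rate == 0:
        return annual_flow * years  # no discounting: flows simply add up
    # standard annuity formula: flow * (1 - (1+r)^-T) / r
    return annual_flow * (1 - (1 + rate) ** -years) / rate

annual_damage = 0.05 * 50e12  # 5% of an assumed $50 trillion world GNP

for r in (0.0, 0.01, 0.04):
    value = pdv(annual_damage, r, 200)
    print(f"discount rate {r:.0%}: PDV = ${value / 1e12:,.0f} trillion")
```

At a near-zero rate the 200-year stream is worth roughly its undiscounted sum of $500 trillion, while at a conventional 4% rate it is worth less than an eighth as much; that gap is the crux of the disagreement over how urgent costly abatement is today.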





http://www.econ.yale.edu/~nordhaus/homepage/SternReviewD2.pdf

The Stern Review on the Economics of Climate Change1

William Nordhaus
November 17, 2006

Opposite ends of the globe

It appears that no two global warming policies on earth are farther
apart than the White House and 10 Downing Street. In 2001, President G.W.
Bush announced his opposition to binding constraints on greenhouse gas
(GHG) emissions. In his letter of opposition, he stated, “I oppose the Kyoto
Protocol because it exempts 80 percent of the world, including major
population centers such as China and India, from compliance, and would
cause serious harm to the U.S. economy.” This policy, much like the war in
Iraq, was undertaken with no discernible economic analysis.2
In stark contrast, the British government in November 2006 presented a
comprehensive new study, the Stern Review on the Economics of Climate Change
1 The author is grateful for helpful comments by Scott Barrett, William Brainard,
Partha Dasgupta, Robert Stavins, Nicholas Stern, and John Weyant.
2
Text of a Letter from the President to Senators Hagel, Helms, Craig, and Roberts, March
13, 2001, http://www.whitehouse.gov/news/releases/2001/03/20010314.html
(downloaded November 13, 2006). There is no record of a fact sheet or other
economic analysis accompanying the letter. The Bush Administration’s economic
analysis was contained in the 2002 Economic Report of the President and the Council of
Economic Advisers, published almost a year after President Bush’s letter to the
Senators. The Economic Report’s analysis suggests that the Kyoto Protocol is costly,
but its analysis does not show that binding action is economically unwarranted.
2
(hereafter the Review).3 Prime Minister Tony Blair painted a dark picture for
the globe at its unveiling, “It is not in doubt that if the science is right, the
consequences for our planet are literally disastrous…. [W]ithout radical
international measures to reduce carbon emissions within the next 10 to 15
years, there is compelling evidence to suggest we might lose the chance to
control temperature rises.”4
The summary in the Review was equally stark: “[T]he Review estimates
that if we don’t act, the overall costs and risks of climate change will be
equivalent to losing at least 5% of global GDP each year, now and forever. If a
wider range of risks and impacts is taken into account, the estimates of
damage could rise to 20% of GDP or more.… Our actions now and over the
coming decades could create risks … on a scale similar to those associated
with the great wars and the economic depression of the first half of the 20th
century.”5
These results are dramatically different from earlier economic models
that use the same basic data and analytical structure. One of the major
findings in the economics of climate change has been that efficient or
“optimal” economic policies to slow climate change involve modest rates of
emissions reductions in the near term, followed by sharp reductions in the
3 All citations in this note were from the online version at http://www.hmtreasury.
gov.uk/independent_reviews/stern_review_economics_climate_change/
sternreview_index.cfm (downloaded various dates, November 2006).
4 PM's comments at launch of Stern Review, http://www.number-
10.gov.uk/output/Page10300.asp (downloaded November 13, 2006).
5 Review, Summary of Conclusions.
medium and long term. We might call this the climate-policy ramp, in which
policies to slow global warming increasingly tighten or ramp up over time.6
While seemingly counterintuitive, the findings about the climate-policy
ramp have survived the tests of multiple alternative modeling strategies,
different climate goals, alternative specifications of the scientific modules, and
more than a decade of revisions in integrated assessment models. The logic of
the climate-policy ramp is straightforward. In a world where capital is
productive, the highest-return investments are primarily in tangible,
technological, and human capital, including research and development in
low-carbon-emissions technologies. As societies become richer in the coming
decades, it becomes efficient to shift investments toward policies that intensify
the pace of emissions reductions and otherwise slow GHG emissions. The
exact mix and timing of emissions reductions depends upon details of costs,
damages, and the extent to which climate change and damages are
irreversible.
While scientists have sounded many somber warnings about the long-term
peril of unchecked climate change,7 the Review attempts to justify strong
6 This strategy was one of the major conclusions in a review of integrated-assessment
models: “Perhaps the most surprising result is the consensus that given calibrated
interest rates and low future economic growth, modest controls are generally
optimal.” David L. Kelly and Charles D. Kolstad, “Integrated Assessment Models for
Climate Change Control,” in Henk Folmer and Tom Tietenberg (eds.), International
Yearbook of Environmental and Resource Economics 1999/2000: A Survey of Current Issues,
Cheltenham, UK, Edward Elgar, 1999.
7 For a recent warning, see James Hansen, Makiko Sato, Reto Ruedy, Ken Lo, David
W. Lea, and Martin Medina-Elizade, “Global temperature change,” Proceedings of the
National Academy of Sciences (US), 103, 2006, pp. 14288-14293.
current action in a cost-benefit economic framework.8 Because its
conclusions are so different from those of most economic studies, the present note
examines the reasons for this major difference. Is this radical revision of
global-warming economics warranted?
Overview of the Review
I will not summarize the basic findings of the Review – a clear summary
is found on its website. Instead, I begin with five summary reactions. First, the
Review is an impressive document, buttressed by more than a dozen
background studies. There is little new science or economics here, but it
provides many new syntheses of the extensive and rapidly growing literature.
While not as balanced and ponderously reviewed as the reports of the
Intergovernmental Panel on Climate Change (IPCC), it is much more current
than the latest IPCC report, published in 2001.9 For those seriously interested
in global warming, it is worth a few days’ study.
Second, while I question some of the Review’s modeling and economic
assumptions, its results are fundamentally correct in sign if not in size. The
approach taken in the Review – selecting climate-change policies with an eye
to balancing economic needs with environmental dangers – is solidly
grounded in mainstream economic analysis. By linking climate-change
policies to both economic and environmental objectives, the Review has
8 The early precursor of this Review is the study by William R. Cline, The Economics
of Global Warming, Washington, Institute for International Economics, 1992.
9 Contribution of Working Group I to the Third Assessment Report of the
Intergovernmental Panel on Climate Change, Climate Change 2001: The Scientific Basis,
J. T. Houghton, Y. Ding, D.J. Griggs, M. Noguer, P. J. van der Linden, and D. Xiaosu,
eds., Cambridge, Cambridge University Press, 2001.
corrected one of the fundamental flaws of the Kyoto Protocol, which had no
such linkage. By contrast, the parallel analysis of the Bush Administration,
cited in footnote 2 above, provided no support for the Bush Administration’s
rejection of binding emissions constraints on GHG emissions.
Third, the Review should be viewed as a political document. Its chief
author is Sir Nicholas Stern, who has had a distinguished career in academic
and government positions. Until 1993, he was a public-finance economist in
British universities specializing in taxation and economic development; today,
he is Head of the Government Economics Service and Adviser to the
Government. The disciplinary background of a public-finance economist is the
leitmotiv running through the chapters. However, it is not an academic study.
Like most government reports, the Review was published without an appraisal
of methods and assumptions by independent outside experts. But even the
analysis of HM Government needs peer review.
The fourth comment concerns the Review’s emphasis on the need for
increasing the price of carbon emissions. The Review summarizes its
discussion here as follows, “Creating a transparent and comparable carbon
price signal around the world is an urgent challenge for international
collective action.” In plain English, the Review argues that it is critical to have a
harmonized carbon tax or similar regulatory device both to provide incentives
to individual firms and households and to stimulate research and
development in low-carbon technologies. Carbon prices must be raised to
transmit the social costs of GHG emissions to the everyday decisions of
billions of firms and people. This simple yet inconvenient economic insight is
virtually absent from most political discussions of climate change policy
(including the marathon slide show by Al Gore in An Inconvenient Truth).
But these points are not the nub of the matter. Rather, and this is the
final comment, the Review’s radical revision arises because of an extreme
assumption about discounting. Discounting is a factor in climate-change
policy – indeed in all investment decisions – which involves the relative
weight of future and present payoffs. At first blush, this area would appear a
technicality that should properly be left to abstruse treatises and graduate
courses in economics. Unfortunately, it cannot be buried in a footnote, for
discounting is central to the radical revision. The Review proposes using a
social discount rate that is essentially zero. Combined with other assumptions,
this enormously magnifies impacts in the distant future and rationalizes deep
cuts in emissions, and indeed in all consumption, today. If we were to
substitute more conventional discount rates used in other global-warming
analyses, by governments, by consumers, or by businesses, the Review’s
dramatic results would disappear, and we would come back to the climate-policy
ramp described above. The balance of this discussion focuses on this
central issue.
The social discount rate: concepts and assumptions
Discounting involves a concept called the pure rate of social time
preference – I will call this “the social discount rate” for short. The social
discount rate is a parameter that measures the importance of the welfare of
future generations relative to the present. It is calculated in percent per year,
like an interest rate, but refers to the discount in future “utility” or welfare,
not future goods or dollars. A zero social discount rate means that future
generations into the indefinite future are treated equally with present
generations; a positive social discount rate means that the welfare of future
generations is reduced or “discounted” compared to nearer generations.
Philosophers and economists have conducted vigorous debates about how to
apply social discount rates in areas as diverse as economic growth, climate
change, energy policy, nuclear waste, major infrastructure programs such as
levees, and reparations for slavery.10
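To make the stakes of this parameter concrete, the relative weight placed on welfare t years ahead is exp(−ρt). The following sketch (the 0.1 percent rate is the Review’s, the 3 percent rate is the conventional sort of rate used in the DICE model discussed later in this note; the chosen horizons are illustrative) computes these weights:

```python
import math

def utility_weight(rho: float, t: float) -> float:
    """Relative weight exp(-rho * t) on welfare t years in the future."""
    return math.exp(-rho * t)

# Near-zero rate (0.1%/yr, as in the Review) versus a conventional 3%/yr rate.
for t in (50, 100, 200):
    w_low = utility_weight(0.001, t)
    w_high = utility_weight(0.03, t)
    print(f"t = {t:>3} years: weight {w_low:.3f} at 0.1%, {w_high:.4f} at 3%")
```

At a two-century horizon the near-zero rate still gives future generations about 82 percent of the present generation’s weight, while the conventional rate gives well under 1 percent; this is why the choice of ρ drives everything that follows.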
Discussions about discount rates need to respect the distinction
between the social discount rate and the discount rate on goods. The former
refers to the relative weights on different people or generations and is the
major source of concern in this note. The latter refers to discounts on bundles
of goods and is measured as a “real interest rate.” I discuss the connection
between these two concepts below.
The sections that follow examine the philosophical arguments about
intergenerational equity, how discounting affects the measurement of
damages, the role of discounting in economic modeling of climate change,
saving behavior, and behavior under uncertainty.
10 Many of the issues involved in discounting, particularly relating to climate change,
are discussed in the different studies in Paul Portney and John Weyant, Discounting
and Intergenerational Equity, Resources for the Future, Washington, D.C., 1999. Note
that the pure rate of social time preference differs from the real interest rate or the
discount rate on goods and services, which is in principle observed in the market
place. A useful summary is contained in K. J. Arrow, W. Cline, K.G. Maler, M.
Munasinghe, R. Squitieri, and J. Stiglitz, “Intertemporal equity, discounting and
economic efficiency,” in Climate Change 1995—Economic and Social Dimensions of
Climate Change, edited by J. Bruce, H. Lee, and E. Haites, 1996, Cambridge:
Cambridge University Press, pp. 125–44.
Philosophical questions about the social discount rate
At the outset, we should recall the warning that Tjalling Koopmans
gave in his pathbreaking analysis of discounting in growth theory. He wrote,
“[T]he problem of optimal growth is too complicated, or at least too
unfamiliar, for one to feel comfortable in making an entirely a priori choice of
[a social discount rate] before one knows the implications of alternative
choices.”11 This conclusion applies with even greater force in global warming
models, which have much greater complexity than the simple, deterministic,
stationary, two-input models that Koopmans analyzed.
The Review argues that it is indefensible to make long-term decisions
with a positive social discount rate. The conclusion of the approach is the
following, “The argument … and that of many other economists and
philosophers who have examined these long-run, ethical issues, is that [a
positive social discount rate] is relevant only to account for the exogenous
possibility of extinction.” (Annex to Chapter 2, p. 52) The argument is that a
high social discount rate would lead societies to ignore large costs that occur
in the distant future. The actual social discount rate used in the Review is 0.1
percent per year, which is only vaguely justified by extinction estimates; for
our purposes, it can be treated as near-zero.
11 Tjalling C. Koopmans, “On the Concept of Optimal Economic Growth,” in
Pontificiae Academiae Scientiarum Scripta Varia 28, 1, Semaine D'Etude sur Le Role de
L'analyse Econometrique dans la Formulation de Plans de Developpement, 1965, pp. 1-75
(available for download at http://cowles.econ.yale.edu/P/au/p_koopmans.htm.)
Zero discounting leads to deep mathematical problems such as non-convergence of
the objective function and incompleteness of the functional. For the analytical
background, see also Frank Ramsey, “A Mathematical Theory of Saving,” Economic
Journal, 1928, 38, pp. 543–559; David Cass, “Optimum Growth in an Aggregative
Model of Capital Accumulation,” Review of Economic Studies, 1965, 32, pp. 233–240.
The logic behind the Review’s social welfare function is not as conclusive
as it claims. The Review argues that fundamental ethics require
intergenerational neutrality using an additive separable logarithmic utility
function. Quite another ethical stance would be to hold that each generation
should leave at least as much total societal capital (tangible, natural, human,
and technological) as it inherited. This would admit a wide array of social
discount rates. A third alternative would be a Rawlsian perspective that
societies should maximize the economic well-being of the poorest generation.
Under this policy, current consumption would increase sharply to reflect likely
future improvements in productivity. Yet a fourth perspective would be a
precautionary (minimax) principle in which societies maximize the minimum
consumption along the riskiest path; this might involve stockpiling vaccines,
grain, oil, and water in contemplation of possible plagues and famines.
Without choosing among these positions, it should be clear that alternative
ethical perspectives are possible. Moreover, as I suggest below, alternative
perspectives provide vastly different prescriptions about desirable climate-change
policies.
Even if a low social discount rate is chosen, a second issue arises in the
calibration of the social discount rate to actual macroeconomic data. Behind the
Review’s modeling is the assumption that the world economy is in long-run
equilibrium of a Ramsey optimal growth model. In a Ramsey equilibrium
with stable population, there are two observables – the rate of return on
capital and the rate of growth of consumption; and there are two normative
parameters – the social discount rate and the curvature of the utility function
(more precisely, the elasticity of the marginal utility of consumption). A
realistic analysis would also need to account for distortions in the tax system,
for uncertainties and risk premiums, and for the equity-premium puzzle, but
these complications can be ignored in the present context.
The Review assumes a relatively low curvature parameter (the
logarithmic utility function) along with the near-zero social discount rate.
However, in calibrating a growth model, the social discount rate and the
curvature parameter cannot be chosen independently if the model is designed
to match observable variables. A low curvature (such as in the logarithmic
utility function) implies a relatively high social discount rate. A high
curvature (represented by a high degree of risk aversion or a high aversion to
intergenerational inequality) implies a low or even negative social discount
rate. It turns out that the calibration of the utility function makes an enormous
difference to the results in global-warming models, as I show in the modeling
section below.
Measuring impacts with near-zero discounting
With these analytical points behind us, I next discuss the Review’s
estimates of the aggregate economic impacts. The Review concludes, “Putting
these three factors together would probably increase the cost of climate
change to the equivalent of a 20% cut in per-capita consumption, now and
forever.” This frightening statement suggests that the globe is perilously close
to driving off a climatic cliff in the very near future. However, this is an
unusual definition of consumption losses, and when the Review says that there
are substantial losses “now,” this does not mean “today.” The measure of
consumption used is the “balanced growth equivalents” of consumption.
Roughly speaking, with low discounting, this is the certainty equivalent of the
average annual consumption loss over the indefinite future. The measure is
akin to an annuity. In fact, the Review’s estimate of the output loss now, as in
“today,” appears to be zero.
If we look inside the impact boxes, we find some strange things. The
damage estimates are much higher than the standard estimates in the impact
literature. This probably occurs because of assumptions that tilt up the
damage curve: rapid economic growth forever, high economic damage
estimates, high climatic impacts of GHG accumulation, catastrophic risks,
adverse health impacts, yet higher sensitivity of the climate system, and an
adjustment for inequality across countries. Additionally, the Review drew
selectively from studies, emphasizing those with high damage estimates,
some of which are highly speculative. For example, the Review used estimates
from the study of Nordhaus and Boyer (see footnote 12 below) that projected
damages way beyond 2100; however, those authors noted that projections
beyond 2100 were particularly unreliable.
However, the major point is that these impacts are far into the future,
and the calculations depend critically upon the assumption of low
discounting. Take as an example the high-climate scenario with catastrophic
and non-market impacts. For this case, the mean losses are less than 1 percent
of world output in 2050, 2.9 percent in 2100, and 13.8 percent in 2200 (see
Figure 6.5d). Yet this somehow turns into a mean annual impact of 14.4
percent shown in Table 6.1, and after a few other gloomy ingredients are
stirred in, it becomes the “20% cut in per-capita consumption, now and
forever.”
How do damages, which average around 5 percent of output over the
next two centuries, turn into a 14.4 percent reduction in consumption now and
forever? The answer lies in the way that near-zero discounting magnifies
distant impacts. With near-zero discounting, the low damages in the next two
centuries get overwhelmed by the long-term average over many centuries. We
can illustrate using the Review’s model discussed in Box 6.3. Suppose that
scientists discover that a wrinkle in the climatic system will cause
damages equal to 0.01 percent of output starting in 2200 and continuing at
that rate thereafter.
How large a one-time investment would be justified today to remove the
wrinkle starting after two centuries? The answer is that a payment of 15 percent
of world consumption today (approximately $7 trillion) would pass the
Review’s cost-benefit test. This seems completely absurd. The bizarre result
arises because the value of the future consumption stream is so high with
near-zero discounting that we would trade off a large fraction of today’s
income to increase a far-future income stream by a very tiny fraction. This
bizarre implication reminds us of Koopmans’s warning, quoted above, not to
accept theoretical assumptions about discounting before examining their full
consequences.
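The order of magnitude of the wrinkle result can be reproduced with a stylized continuous-time approximation (this is not the Review’s actual Box 6.3 model, so it yields a somewhat smaller figure than the 15 percent quoted above): with logarithmic utility, a small perpetual proportional loss f starting T years ahead has a present utility-equivalent cost of roughly f·exp(−ρT)/ρ as a share of today’s consumption.

```python
import math

def pv_share(f: float, T: float, rho: float) -> float:
    """Present cost, as a share of current consumption, of a perpetual
    proportional loss f that begins T years from now, discounted at rho."""
    return f * math.exp(-rho * T) / rho

# A 0.01% loss starting in 2200 (T = 194 years from 2006):
near_zero = pv_share(f=1e-4, T=194, rho=0.001)      # Review-style rate
conventional = pv_share(f=1e-4, T=194, rho=0.03)    # conventional rate
print(f"near-zero rate:    {near_zero:.1%} of today's consumption")
print(f"conventional rate: {conventional:.5%} of today's consumption")
```

Even this crude version puts the near-zero answer around 8 percent of current consumption, versus roughly a thousandth of a percent at a conventional rate: a magnification of several thousandfold from the discount rate alone.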
Hence, the damage puzzle is resolved. The large damages from global
warming reflect large and speculative damages in the far-distant future; the
impacts now, as in today, are small; and, as I will suggest below, the 20
percent cut in consumption from global warming might be reduced by an
order of magnitude if alternative assumptions about discounting are used.
Economic modeling with low discount rates
I next apply these points in an empirical model of the economics of
global warming. To foreshadow the result, these calculations show that the
assumption of a near-zero social discount rate drives most of the economic
results in the Review.
It is virtually impossible for mortals outside the group that did the
modeling to understand the detailed results of the Review. It would involve
studying the economics and geophysics in several chapters, taking apart a
complex analysis (the PAGE model), and examining the derivation and
implications of each of the economic and scientific judgments.
The alternative approach followed here is to use a small and well-documented
model of the economics of climate change to estimate the optimal
policy, and then to make parameter adjustments to parallel assumptions made
in the Review. For this purpose, I use the “DICE model,” which is an acronym
for a Dynamic Integrated model of Climate and the Economy. This model,
developed in the early 1990s, uses a simple dynamic representation of the
scientific and economic links among population, technological change, GHG
emissions, concentrations, climate change, and damages. The analytical
structure of the DICE model is identical to that in the Review. DICE calculates
the paths of capital investment and GHG reductions that maximize a social
welfare function, where the social welfare function is the discounted sum of
population-weighted utilities of per capita consumption. The DICE model
assumes a pure rate of social time preference starting at 3 percent per year and
declining slowly to about 1 percent per year in 300 years. The social discount
rate was chosen to be consistent with a logarithmic utility function, market
interest rates, and rates of private and public saving and investment.12
For this analysis, I have updated the DICE model to 2005 data,
economics, science, and 2006 prices.13 I then make three runs, which are
explained as we proceed:
Run 1. Optimal climate change policy in the DICE-2006 model
Run 2. Optimal climate change using the Stern Review zero discount rate
Run 3. Optimal climate change using a recalibrated zero discount rate
Run 1. Run 1 is the Optimal climate change policy in DICE-2006. This run
takes the DICE-2006 model and calculates the optimal trajectory of climate
change policies as described above. This calculation leads to an optimal
carbon price in 2005 of $17.12 per ton C, rising over time to $84 in 2050 and
$270 in 2100. (The “optimal carbon price,” or carbon tax, sometimes called the
“social cost of carbon,” is the calculated price of carbon emissions that will
balance the incremental costs of reducing carbon emissions with the
12 Results and documentation of the DICE model are provided in William Nordhaus,
“An Optimal Transition Path for Controlling Greenhouse Gases,” Science, vol. 258,
November 20, 1992, pp. 1315-1319; William Nordhaus, Managing the Global Commons:
The Economics of Climate Change, MIT Press, Cambridge, Mass., 1994; William
Nordhaus and Zili Yang “A Regional Dynamic General-Equilibrium Model of
Alternative Climate-Change Strategies,” American Economic Review, vol. 86, No. 4,
September 1996, pp. 741-765; William Nordhaus and Joseph Boyer, Warming the
World: Economic Modeling of Global Warming, MIT Press, Cambridge, Mass, 2000;
William Nordhaus, “Global Warming Economics,” Science, November 9, 2001, vol.
294, no. 5545, pp. 1283-1284.
13 Documentation of the changes in the DICE-2006 model and the GAMS computer
program for the DICE-2006 model are provided in William D. Nordhaus,
“Documentation for DICE-2006, November 2006 round,” November 17, 2006,
available at www.nordhaus.econ.yale.edu, under “Recent Stuff.”
incremental benefits of reducing climate damages.) The optimal rate of
emissions reduction is 6 percent in 2005, 14 percent in 2050, and 25 percent in
2100.14 This optimized path leads to a projected global temperature increase
from 2000 to 2100 of around 1.8 degrees C. While the findings of such
mainstream economic assessments may not satisfy the most ardent
environmentalists, if followed they would go far beyond current global
emissions reductions and would be a good first step on a journey of many
miles.
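The climate-policy ramp is visible in these Run 1 figures. A quick check of the implied average growth rate of the optimal carbon price (simple arithmetic on the numbers quoted above):

```python
import math

def avg_growth(p0: float, p1: float, years: float) -> float:
    """Average continuously compounded growth rate between two prices."""
    return math.log(p1 / p0) / years

# Run 1 optimal carbon price: $17.12/tC (2005), $84 (2050), $270 (2100).
print(f"2005-2050: {avg_growth(17.12, 84.0, 45):.1%} per year")   # ~3.5%
print(f"2050-2100: {avg_growth(84.0, 270.0, 50):.1%} per year")   # ~2.3%
```

The price roughly doubles every two decades in the early period; that is, policy tightens steadily over time rather than beginning with deep immediate cuts.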
Run 2. The results of the standard DICE model just discussed are
completely different from those in the Review. The Review recommends a
social cost of carbon of $311 per ton C. This number is almost 20 times the
DICE model result. Based on calculations made in earlier publications (see
footnote 12), it seems likely that the major reason for the Review’s sharp
emissions reductions and high carbon price is the low social discount rate. I
therefore calculated run 2, Optimal climate change using the Stern Review zero
discount rate. The assumptions are the same as Run 1 except that the social
discount rate is changed to 0.1 percent per year. This dramatically changes the
trajectory of climate-change policy. The 2005 optimal carbon price in the DICE
model rises from $17.12 in Run 1 to $159 per ton C in Run 2.15 Efficient
emissions reductions in Run 2 are much larger – with emissions reductions of
14 The future numbers are the solutions to the model based on current information
and provide estimates of optimal future policies under current estimates of
parameters. They are not decisions that are taken today. They should be revised over
time as new scientific and economic information becomes available.
15 The social cost of carbon estimated in the Review is approximately two times
higher than the number calculated in Run 2. Because different models are used, it is
not possible to identify reasons for the discrepancy. Modeling results are extremely
sensitive to parameter changes when the discount rate is near-zero.
50 percent in 2015 – because future damages are in effect treated as occurring
today. The climate-policy ramp flattens out.
Run 3. An earlier section noted that alternative calibrations of the social
welfare function are consistent with observable variables. So the final run is
one that assumes a low social discount rate but calibrates the curvature
parameter so that the economic growth path conforms to
observable variables. Some history might be helpful here. When the DICE
model was constructed fifteen years ago, I assumed logarithmic utility for
computational reasons – alternative utility functions would not converge
numerically. This calibration led to a social discount rate of 3 percent per year,
which was calibrated to match the growth of consumption, savings rates, and
market rates of return on capital. Because of improvements in computers and
software, we can now easily calibrate alternative utility functions. Experiments
with the DICE-2006 model indicate that a social discount rate of 0.1 percent
per year is consistent with a utility curvature parameter of 2.25. However, the
Review’s social discount rate of 0.1 percent per year is inconsistent with its
utility curvature assumption of 1.16 The Review’s calibration gives too low a
rate of return and too high a savings rate compared to macroeconomic data,
16 The discussion in the text assumes zero population growth. More generally, the
Ramsey-Cass-Koopmans steady-state optimal growth equilibrium equation is
r = ρ + αg + n, where r = the marginal product of capital, ρ = social discount rate,
α = elasticity of the marginal utility of consumption, g = growth of per capita
consumption, and n = rate of growth of population. Conceptually, the marginal
product of capital has the same units as the real interest rate, but entirely different
units from the social discount rate. To apply this equilibrium condition, assume that
the observable variables (in rates per year) are r = 0.05, n = 0.00, and g = 0.02. For this
simplest equation, if we assume that the social discount rate is ρ = 0 per year, then α
= 2.5. If we take the log-linear utility function of the Review together with the
observable variables in this footnote, then this implies that ρ = 0.03 per year. The
calibrations in DICE-2006 are slightly different from these equilibrium calculations
because of positive population growth and non-constant consumption growth, but
these equilibrium calculations give the flavor of the results.
but the alternative calibration proposed here fits the macroeconomic data
underlying the DICE model.
We can now rerun the DICE-2006 model with the near-zero social
discount rate and the associated calibrated curvature parameter derived in the
last paragraph. This is Run 3, Optimal climate change with recalibrated zero
discount rate. Run 3 looks very similar to Run 1, the standard DICE-2006 model
optimal policy. The first-period social cost of carbon in Run 3 is $19.55 per ton
C, slightly above Run 1. The recalibrated run looks nothing like Run 2, which
is the run that reflects the Review’s assumption. How can it be that Run 3, with
a near-zero social discount rate, looks so much like Run 1? The reason is that
the recalibrated social discount rate in Run 3 maintains the assumption of
productive capital, with a relatively high real interest rate in the near term.
This high return means that the logic of the climate-policy ramp continues to
hold even though the social choice function has been recalibrated to a zero
social discount rate. This calibration removes the cost-benefit dilemmas just
discussed as well as the savings and uncertainty problems discussed in the
next two sections.
Implications for saving and investment
I return for the balance of this note to the Review’s assumptions on both
the social discount rate and utility curvature (the assumptions that underlie Run
2). One surprising implication of the Review’s social discount rate is the effect
on consumption and saving. If the Review’s philosophy were adopted as a
general policy, it would produce much higher overall saving as compared
with today. In Run 2 (Optimization with Stern discount parameter), the global net
savings rate almost doubles compared to the historical numbers or Run 1. This
implies that global consumption would be reduced by about 14 percent,
requiring a reduction of $6 trillion per year in current consumption.
Where would the consumption cuts come from? From India and Africa?
That hardly seems equitable. The higher investment would be more than five
times total overseas development aid of all countries today. Perhaps the
consumption should come from the wasteful Americans? This would be four-fifths
of current levels of consumption and many times the decline in the
Great Depression.
Aside from the question of who pays, we might wonder whether such a
large decline in current consumption today is desirable in a world where
average consumption is growing rapidly. The Review projects that per capita
consumption will grow at 1.3 percent per year over the next two centuries (p.
162). In 2006 dollars, this means that today’s per capita consumption of $7,600
would grow to $94,000 in 2200. Here perhaps is a shard of hope for the globe.
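The compounding behind these figures is easy to verify (a back-of-envelope replication; small differences from the quoted $94,000 reflect rounding):

```python
def compound(c0: float, g: float, years: float) -> float:
    """Consumption level after `years` of growth at annual rate g."""
    return c0 * (1 + g) ** years

# $7,600 per capita growing at 1.3%/yr for 194 years (2006 to 2200):
baseline_2200 = compound(7600, 0.013, 194)
# Applying the 13.8 percent high-damage loss discussed in this note:
damaged_2200 = baseline_2200 * (1 - 0.138)
print(f"baseline 2200: ${baseline_2200:,.0f}")   # roughly $94,000
print(f"with damages:  ${damaged_2200:,.0f}")    # roughly $81,000
```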
However, this growth also means that future climatic damages will
come out of a much higher level of income. For example, the high-damage
case is associated with a 13.8 percent decline in consumption in 2200 as
discussed above. This means that per capita consumption would grow from
$7,800 today to only $81,000 in 2200. Hence, the Review advocates reducing
current consumption to prevent the decline in consumption of future
generations that it projects to be much richer than today. While this might be
worth contemplating, it hardly seems ethically compelling.
Faced with these implications of the discounting assumption, advocates
of the Review policy might propose a “dual-discounting” approach – limiting
the scope of the low social discount rate to climate policy. In other words,
perhaps countries should choose global-warming policies assuming the near-zero
social discount rate, but leave the rest of the economy to operate with the
present high social discount rate. While this seems an attractive possibility, it
is in fact a roundabout way to slow climate change sharply. In effect, we are
using a low social discount rate to “prevent dangerous interference with the
climate system” (in the language of the Framework Convention on Climate
Change). If that is the reason, why not impose the limit directly? Instead of
using the near-zero social discount rate as an analytic subterfuge to slow
climate change, why not simply adopt policies that will directly keep climate
change below the dangerous threshold? Limiting climate change directly is
more efficient as well as more transparent.
Hair triggers and uncertainty
A further unattractive feature of the Review’s near-zero social discount
rate is that it puts present decisions on a hair-trigger in response to far-future
contingencies. Under conventional discounting, contingencies many centuries
ahead have a tiny weight in today’s decisions. Decisions focus on the near
future. With the Review’s discounting procedure, by contrast, present
decisions become extremely sensitive to uncertain events in the distant future.
We saw above how an infinitesimal impact on the post-2200 income
stream could justify a large consumption sacrifice today. We can use the same
example to illustrate how far-future uncertainties are magnified by low
discount rates. Suppose that we suddenly learn that there is a 10 percent
probability of the wrinkle in the climatic system that reduces the post-2200
income stream by 0.01 percent. What insurance premium would be justified
today to reduce that probability to zero? With conventional discount rates, we
would probably ignore any tiny wrinkle two or three centuries ahead. If we
did a careful calculation using conventional discount rates, we would
calculate a breakeven 0.0002 percent insurance premium to remove the year-
2200 contingency, and a 0.0000003 percent premium for the year-2400
contingency. Moreover, these dollar premiums are small whether the
probability is large or small.
With the Review’s near-zero discount rate, offsetting the low-probability
wrinkle would be worth an insurance premium today of almost 2 percent of
current income, or $1 trillion. We would pay almost the same amount if that
threshold were to be crossed in 2400 rather than in 2200. Because the future is
so greatly magnified by a near-zero social discount rate, policies would be
virtually identical for different threshold dates. Moreover, a small refinement
in the probability estimate would trigger a large change in the dollar premium
we would pay. We are in effect forced to make current decisions about highly
uncertain events in the distant future even though these estimates are highly
speculative and are almost sure to be refined over the coming decades.
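The magnification effect described above can be sketched numerically. The parameters below are my own illustrative assumptions (2 percent income growth, a 6 percent "conventional" discount rate versus a 2.1 percent near-zero rate, a 10 percent chance of a 0.01 percent permanent income loss starting in year 200 or 400), not the Review's or Nordhaus's exact calibration, so the levels differ from the figures quoted in the text; the qualitative pattern is the point.

```python
# Illustrative sketch (parameter values are assumptions, not Nordhaus's or
# the Stern Review's calibration): the break-even insurance premium today,
# as a share of current income, against a probability p that income falls
# by `loss` in every year from year T onward. Income grows at rate g and
# future losses are discounted back at rate r.

def premium_share(r, T, g=0.02, p=0.10, loss=0.0001, horizon=1000):
    """Present value of the expected far-future loss, in units of today's income."""
    pv = 0.0
    for t in range(T, T + horizon):
        income_t = (1 + g) ** t                  # income in year t (today = 1)
        expected_loss = p * loss * income_t      # expected shortfall that year
        pv += expected_loss / (1 + r) ** t       # discount back to today
    return pv

# Conventional discounting: the contingency is nearly invisible, and pushing
# it from year 200 to year 400 shrinks the premium by orders of magnitude.
conv_2200 = premium_share(r=0.06, T=200)
conv_2400 = premium_share(r=0.06, T=400)

# Near-zero discounting (r barely above g): the premium is thousands of
# times larger, and the threshold date hardly matters.
near_zero_2200 = premium_share(r=0.021, T=200)
near_zero_2400 = premium_share(r=0.021, T=400)
```

With these assumed numbers the conventional-rate premium for the year-400 contingency is hundreds of times smaller than for year 200, while the two near-zero premiums are within roughly 20 percent of each other, which is the hair-trigger property the passage describes.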
While this feature of low discounting might appear benign in climate-change
policy, we could imagine other areas where the implications could
themselves be dangerous. Imagine the preventive war strategies that might be
devised with low social discount rates. Countries might start wars today
because of the possibility of nuclear proliferation a century ahead; or because
of a potential adverse shift in the balance of power two centuries ahead; or
because of speculative futuristic technologies three centuries ahead. It is not
clear how long the globe could survive the calculations and machinations
of zero-discount-rate military powers. This is yet a final example of a
surprising implication of a low discount rate.
Summary verdict
How much and how fast should the globe reduce greenhouse-gas
emissions? How should nations balance the costs of the reductions against the
damages and dangers of climate change? The Stern Review answers these
questions clearly and unambiguously: we need urgent, sharp, and immediate
reductions in greenhouse-gas emissions.
I am reminded here of President Harry Truman’s complaint that his
economists would always say, on the one hand this and on the other hand
that. He wanted a one-handed economist. The Stern Review is a Prime
Minister’s dream come true. It provides decisive and compelling answers
instead of the dreaded conjectures, contingencies, and qualifications.
However, a closer look reveals that there is indeed another hand to
these answers. The radical revision of the economics of climate change
proposed by the Review does not arise from any new economics, science, or
modeling. Rather, it depends decisively on the assumption of a near-zero
social discount rate. The Review’s unambiguous conclusions about the need for
extreme immediate action will not survive the substitution of discounting
assumptions that are consistent with today's marketplace. So the central
questions about global-warming policy – how much, how fast, and how costly
– remain open. The Review informs but does not answer these fundamental
questions.

Friday, November 17, 2006

“Most economics departments are like country clubs” says a Nobel Laureate. Is This True?

“Most economics departments are like country clubs,” said James J. Heckman, a Chicago faculty member and Nobel laureate. “But at Chicago you are only as good as your last paper.” This quote is from Milton Friedman's obituary in today's New York Times.
see http://www.nytimes.com/2006/11/17/business/17friedman.html

I am not an expert on "deconstructionism" but it is still interesting to think about what this quote means. I've never been a member of a country club. I haven't set foot in such a club in 25 years.

So, what is Jim Heckman saying about every department besides Chicago?

One Scenario:

Department members engage in a "don't rock the boat" strategy. In this repeated prisoner's dilemma, people are nice to each other and do not mock each other's low productivity. Older tenured guys can rest on their laurels because in the 1970s they wrote some top-journal papers. Now they can relax, consult, and take up hobbies such as watching football. Young members of the department look at the old members and get a glimpse of their future. They can work hard when young, get tenured, and then rest.
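The "don't rock the boat" story is a standard repeated-game equilibrium. A toy sketch (the payoff numbers are invented for illustration) shows when mutual niceness is sustainable under a grim-trigger threat of permanent mutual mocking:

```python
# Toy repeated prisoner's dilemma for department culture (payoffs invented):
# each period, two colleagues either stay "nice" (don't mock low output)
# or "mock". Mutual niceness is sustained when colleagues are patient
# enough, i.e. the discount factor delta is high enough.

NICE, MOCK = "nice", "mock"
PAYOFF = {  # (my move, colleague's move) -> my per-period payoff
    (NICE, NICE): 3,   # the quiet country-club life
    (MOCK, NICE): 4,   # one-period gain from feeling superior
    (NICE, MOCK): 0,   # being mocked while staying polite
    (MOCK, MOCK): 1,   # mutual hostility forever after
}

def cooperation_sustainable(delta):
    """Grim trigger: one deviation triggers mutual mocking in all later periods."""
    stay_nice = PAYOFF[(NICE, NICE)] / (1 - delta)
    deviate = PAYOFF[(MOCK, NICE)] + delta * PAYOFF[(MOCK, MOCK)] / (1 - delta)
    return stay_nice >= deviate

print(cooperation_sustainable(0.9))   # patient colleagues: True
print(cooperation_sustainable(0.1))   # impatient colleagues: False
```

With these payoffs the niceness equilibrium holds whenever delta is at least one third; Heckman's quip is, in effect, that Chicago's norms destroyed this equilibrium on purpose.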

Heckman's point may be that from a Dean's perspective this is not optimal. A more obnoxious culture could shame smart people into continuing to write academic papers rather than resting on their past stock of output.

A possible counter-claim is that in a friendly department, colleagues speak to each other more, and some of these social interactions improve the research being conducted.

If at Chicago you are judged only on your last paper, does this implicit tax lead researchers to write too few papers? If my reputation hinges on my latest paper, might I hide my work for too long? This raises the question of how you write a good paper. What is the production function? When do you allow your peers to see your work? Will they engage and think about it?

I do think that Jim's quote accurately depicts my Chicago experience. It was not a very friendly place; it was a very serious place. Can a department have the best of both worlds, being a happy, "fun," productive place? I think that Harvard, MIT, and Tufts would all claim yes!

Thursday, November 16, 2006

Milton Friedman’s Impact on my Chicago Training in Economics

I entered the University of Chicago's Ph.D. program in Economics in the fall of 1988. Milton Friedman had moved west more than ten years before that. His ongoing impact on the Department was clearly visible in the subject matter taught by Gary Becker, Sherwin Rosen, and Bob Lucas. The unique Chicago seminar style must have been inherited from his and George Stigler's workshops. In such seminars, the speaker must fight to be heard as multiple participants press the points they urgently want to make.

I wanted to add some discussion that Friedman offered, at age 90, at a November 2002 conference at the University of Chicago. He made sharp comments on frontier research that built on his work from 40 years before. Now that's a sharp mind!
I couldn’t find a link to it so instead I report the introduction of his Nobel Prize lecture from 30 years ago.

http://nobelprize.org/nobel_prizes/economics/laureates/1976/friedman-lecture.pdf


INFLATION AND UNEMPLOYMENT
Nobel Memorial Lecture, December 13, 1976
by MILTON FRIEDMAN

The University of Chicago, Illinois, USA

When the Bank of Sweden established the prize for Economic Science in
memory of Alfred Nobel (1968), there doubtless was - as there doubtless still
remains - widespread skepticism among both scientists and the broader public
about the appropriateness of treating economics as parallel to physics, chemistry,
and medicine. These are regarded as “exact sciences” in which objective,
cumulative, definitive knowledge is possible. Economics, and its fellow social
sciences, are regarded more nearly as branches of philosophy than of science
properly defined, enmeshed with values at the outset because they deal with
human behavior. Do not the social sciences, in which scholars are analyzing
the behavior of themselves and their fellow men, who are in turn observing and
reacting to what the scholars say, require fundamentally different methods of
investigation than the physical and biological sciences? Should they not be
judged by different criteria?
1. SOCIAL AND NATURAL SCIENCES
I have never myself accepted this view. I believe that it reflects a misunderstanding
not so much of the character and possibilities of social science as of
the character and possibilities of natural science. In both, there is no “certain”
substantive knowledge; only tentative hypotheses that can never be “proved”,
but can only fail to be rejected, hypotheses in which we may have more or less
confidence, depending on such features as the breadth of experience they
encompass relative to their own complexity and relative to alternative hypotheses,
and the number of occasions on which they have escaped possible
rejection. In both social and natural sciences, the body of positive knowledge
grows by the failure of a tentative hypothesis to predict phenomena the
hypothesis professes to explain; by the patching up of that hypothesis until
someone suggests a new hypothesis that more elegantly or simply embodies
the troublesome phenomena, and so on ad infinitum. In both, experiment is
sometimes possible, sometimes not (witness meteorology). In both, no experiment
is ever completely controlled, and experience often offers evidence that
is the equivalent of controlled experiment. In both, there is no way to have a
self-contained closed system or to avoid interaction between the observer and
the observed. The Gödel theorem in mathematics, the Heisenberg uncertainty
principle in physics, the self-fulfilling or self-defeating prophecy in the social
sciences all exemplify these limitations.
Of course, the different sciences deal with different subject matter, have
different bodies of evidence to draw on (for example, introspection is a more
important source of evidence for social than for natural sciences), find different
techniques of analysis most useful, and have achieved differential success in
predicting the phenomena they are studying. But such differences are as great
among, say, physics, biology, medicine, and meteorology as between any of
them and economics.
Even the difficult problem of separating value judgments from scientific
judgments is not unique to the social sciences. I well recall a dinner at a
Cambridge University college when I was sitting between a fellow economist
and R. A. Fisher, the great mathematical statistician and geneticist. My fellow
economist told me about a student he had been tutoring on labor economics,
who, in connection with an analysis of the effect of trade unions, remarked,
“Well surely, Mr. X (another economist of a different political persuasion)
would not agree with that.” My colleague regarded this experience as a terrible
indictment of economics because it illustrated the impossibility of a value-free
positive economic science. I turned to Sir Ronald and asked whether such an
experience was indeed unique to social science. His answer was an impassioned
“no”, and he proceeded to tell one story after another about how accurately
he could infer views in genetics from political views.
One of my great teachers, Wesley C. Mitchell, impressed on me the basic
reason why scholars have every incentive to pursue a value-free science, whatever
their values and however strongly they may wish to spread and promote
them. In order to recommend a course of action to achieve an objective, we
must first know whether that course of action will in fact promote the objective.
Positive scientific knowledge that enables us to predict the consequences of a
possible course of action is clearly a prerequisite for the normative judgment
whether that course of action is desirable. The Road to Hell is paved with
good intentions, precisely because of the neglect of this rather obvious point.
This point is particularly important in economics. Many countries around
the world are today experiencing socially destructive inflation, abnormally
high unemployment, misuse of economic resources, and, in some cases, the
suppression of human freedom not because evil men deliberately sought to
achieve these results, nor because of differences in values among their citizens,
but because of erroneous judgments about the consequences of government
measures: errors that at least in principle are capable of being corrected by
the progress of positive economic science.
Rather than pursue these ideas in the abstract [I have discussed the methodological
issues more fully in (l)], I shall illustrate the positive scientific character
of economics by discussing a particular economic issue that has been a
major concern of the economics profession throughout the postwar period;
namely, the relation between inflation and unemployment. This issue is an
admirable illustration because it has been a controversial political issue
throughout the period, yet the drastic change that has occurred in accepted
professional views was produced primarily by the scientific response to experience
that contradicted a tentatively accepted hypothesis - precisely the
classical process for the revision of a scientific hypothesis.
I cannot give here an exhaustive survey of the work that has been done on
this issue or of the evidence that has led to the revision of the hypothesis. I
shall be able only to skim the surface in the hope of conveying the flavor of
that work and that evidence and of indicating the major items requiring further
investigation.
Professional controversy about the relation between inflation and unemployment
has been intertwined with controversy about the relative role of
monetary, fiscal, and other factors in influencing aggregate demand. One issue
deals with how a change in aggregate nominal demand, however produced,
works itself out through changes in employment and price levels; the other,
with the factors accounting for the changes in aggregate nominal demand.
The two issues are closely related. The effects of a change in aggregate
nominal demand on employment and price levels may not be independent of
the source of the change, and conversely the effect of monetary, fiscal, or other
forces on aggregate nominal demand may depend on how employment and
price levels react. A full analysis will clearly have to treat the two issues jointly.
Yet there is a considerable measure of independence between them. To a
first approximation, the effects on employment and price levels may depend
only on the magnitude of the change in aggregate nominal demand, not on its
source. On both issues, professional opinion today is very different than it was
just after World War II because experience contradicted tentatively accepted
hypotheses. Either issue could therefore serve to illustrate my main thesis.
I have chosen to deal with only one in order to keep this lecture within
reasonable bounds. I have chosen to make that one the relation between inflation
and unemployment, because recent experience leaves me less satisfied
with the adequacy of my earlier work on that issue than with the adequacy
of my earlier work on the forces producing changes in aggregate nominal
demand.

Wednesday, November 15, 2006

Is Los Angeles Dangerous for Kahn? A test of the "Law of Small Numbers"

Don't ask how I found the story I reprint below. Another guy named Matt Kahn, who currently lives in Los Angeles, was assaulted when he made the mistake of supplying replacement workers in the middle of a strike. The "law of small numbers" claims that we are not good Bayesians: when we update our priors about something such as "Is Los Angeles a safe city?", we place too much weight on salient data points. In this case, the salient data point is that a guy with my name got beat up in a city that I'm moving to in six weeks.
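The contrast between a good Bayesian and a "law of small numbers" updater can be made concrete. All of the probabilities below are invented for illustration (they come from no study): a prior belief that LA is "dangerous," and a single assault report that is only weakly diagnostic.

```python
# Toy illustration (all numbers invented) of why one salient news story
# should barely move a good Bayesian's beliefs about city safety.
# Prior: P(LA is "dangerous") = 0.2. A single assault report appears with
# probability 0.6 if the city is dangerous and 0.5 if it is safe, so the
# evidence is only weakly diagnostic.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: P(hypothesis | evidence)."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# The rational update: the posterior barely rises above the 0.2 prior.
rational = posterior(prior=0.2, p_evidence_if_true=0.6, p_evidence_if_false=0.5)
print(round(rational, 3))  # 0.231

# The "law of small numbers" updater treats the salient story as if it
# were highly diagnostic (0.9 vs. 0.1) and over-reacts.
salient = posterior(prior=0.2, p_evidence_if_true=0.9, p_evidence_if_false=0.1)
print(round(salient, 3))  # 0.692
```

The bias, in other words, is not in Bayes' rule itself but in the inflated likelihood ratio the vivid story is given.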

I should ask UCLA for a pay raise to compensate me for this person specific risk!


International Union Hierarchy Sued For Viciously Assaulting Worker
National legal foundation to assist victim of union violence

Los Angeles, Calif. (May 6, 2002) — With the help of the National Right to Work Foundation, Matthew Kahn filed suit today in Los Angeles County Superior Court against the Union of Needletrades, Industrial and Textile Employees (UNITE), for damages incurred during a vicious union beating following a 2001 strike.

The lawsuit alleges that on May 18, 2001, UNITE organizer Ramiro Hernandez and several union militants attacked Kahn in the parking lot of Labor Ready’s office in Commerce, giving him a concussion and several gashes on his head.

According to the complaint, the union brass bailed Hernandez, a long time union organizer, out of jail after the assault. Later investigation showed that Hernandez possesses an extensive arrest record for union-related activities.

“These thugs must be made to pay for their cowardly assault on an innocent man,” said Stefan Gleason, Vice President of the National Right to Work Foundation.

The problems began in March 2001, when UNITE Local 482 began a strike against Hollander Home Fashions. Over the next two months, UNITE union official Ramiro Hernandez continually harassed Matthew Kahn, a branch manager for Labor Ready’s office in Commerce. Kahn was responsible for providing replacement workers during the strike. UNITE and its local affiliates were aware that Hernandez had numerous prior arrests for strike-related violence, and they have provided financial support to help Hernandez escape any punishment for his violent actions.

“By encouraging and supporting Hernandez and his goons, the top brass of UNITE are directly responsible for letting this happen,” stated Gleason.

Unfortunately, this is not an isolated incident. The National Institute for Labor Relations has recorded almost 10,000 media-reported incidents of union violence since 1975. Experts on labor- and strike-related violence estimate that unreported acts of harassment could swell that figure to 100,000 or more.

The National Right to Work Legal Defense Foundation is a nonprofit, charitable organization providing free legal aid to employees whose human or civil rights have been violated by compulsory unionism abuses. The Foundation, which can be contacted toll-free at 1-800-336-3600, is assisting over 250,000 employees in over 200 cases nationwide.

Harvard's Effort to Mitigate Climate Change

Below, I report a concise, punchy editorial in today's Harvard Crimson. The unsigned authors are pretty good economists, thinking through different policies and their likely cost-effectiveness. This editorial raises the bigger issue of the adoption of greenhouse gas policies by universities in general.

Universities are non-profits, and cutting greenhouse gas emissions requires upfront expenditures. Will universities with higher endowments do more upfront reductions in greenhouse gases? Will universities in "Red States" do less? Will such universities be more likely to free ride? Or does the decision hinge on whether the president is a green, or on whether the student body is politically active? Will we see imitation across universities, such that if a "leader" such as Harvard takes a pro-active step, other schools such as Columbia will imitate it?

Harvard Crimson

Opinion
The Uninformed Vote
A vague resolution on cutting greenhouse gas emissions does not belong on the UC ballot
Published On 11/15/2006 4:09:17 AM
By THE CRIMSON STAFF

Cutting greenhouse gas emissions is good. But at what cost? The Environmental Action Committee (EAC) is asking the Faculty of Arts and Sciences (FAS) to cut its emissions (through on-campus energy conservation and alternative energy purchases, for example) and would like the student voice behind it. The EAC has crafted a resolution that calls on FAS to “reduce its greenhouse gas emissions to a level 11 percent below total emissions in 1990 by the year 2020”—a bit more than the level mandated by the 1997 Kyoto Protocol on the national level. On Sunday, the Undergraduate Council (UC) will vote on whether this resolution will appear as a referendum on next month’s UC presidential election ballot. Although putting the measure to a student vote is well-intentioned, the referendum does not belong on the ballot this fall.

The EAC has proposed a multi-prong strategy to reduce emissions but has only a rough idea of how much the entire project would cost. While some initiatives, like renovating buildings to be more energy efficient, may eventually save FAS money, others, like purchasing electricity from a wind farm, will not. Because the EAC is still assembling its plan, it cannot currently provide a firm cost estimate.

Regardless of the specifics of the plan, we expect that Harvard will need to front a significant sum to begin reducing emissions. Simply put, renovating historic buildings is not cheap, and it is unclear where such funding might come from. Moreover, students have a right to know, before they vote, if this referendum will mean higher tuition or termbill fees, if it might mean funding cuts in other areas of FAS (and what areas they might be), or if private and government grants or loans could be used to jumpstart longer-term efforts. Right now, students do not have the information available to make an informed decision on the costs and benefits of a long-term greenhouse gas emissions reduction effort on campus.

Recognizing the clear danger that climate change poses to the earth, this page has previously asked Harvard to be a leader in the field of energy research and sustainable energy generation. We are not yet convinced, however, that a costly effort to reduce Harvard’s own emissions is the best way to do so. But Harvard has other tools at its disposal. We believe that, dollar for dollar, Harvard may be able to make a greater contribution to energy conservation by funding research than by cutting its own emissions.

This is not to say that making the Harvard campus more efficient is a completely unworthy goal. Already, several environmental advocacy groups at Harvard—including the EAC, the Harvard Green Campus Initiative, and the Resource Efficiency Program—undertake various initiatives that save the University hundreds of thousands of dollars each year through reduced energy consumption. But these projects are either small in their expense or large in their monetary savings (or both); a vow to reduce FAS-wide emissions by 11 percent will certainly bring expenses but will not guarantee savings. By this spring the EAC expects to have data from a more detailed inventory of Harvard’s greenhouse gas emissions—and, in turn, a better idea of the costs of reducing emissions. Without that information, however, it will be impossible to assemble a more concrete proposal for emissions cuts, and so until then it is impossible to judge the merits of any plan to reduce emissions.

The UC needs to choose wisely the referenda that it places on its ballot. Referenda on students’ preferences on vague issues do not have a place in a campus election. Without knowing the cost of this proposal and who will pay for it, students can only voice their support for the idea of reducing emissions. Until more information is available to students, this referendum belongs in a basket of well intentioned ideas, but not on a UC ballot.

http://www.thecrimson.com/article.aspx?ref=515764

Tuesday, November 14, 2006

Two New Environmental Economics Papers

Apparently, we don't get paid for blogging! So, I've sat down and written some new academic work. This blog entry is a paid advertisement to tell you about two exciting new papers available at www.ssrn.com.


Do Greens Drive Hummers or Hybrids? Environmental Ideology as a Determinant of Consumer Choice and the Aggregate Ecological Footprint

Abstract

The environmental movement has been an effective interest group, lobbying governments to enact policies that enhance public goods (the environment) at the expense of tightly organized for-profit interests. Collectively, environmentalists have been able to overcome free-rider problems to achieve their goals. But, in day-to-day life, do environmentalists "free ride"? Do they live a "green" lifestyle? This paper uses several California data sets to test for differences in consumption patterns between greens and browns. I document that a community's share of Green Party registered voters is a viable proxy for community environmentalism. Environmentalists are more likely to commute by public transit, purchase hybrid vehicles, and consume less gasoline than non-environmentalists. These observed differentials have aggregate implications for explaining why some nations lie below the cross-national Environmental Kuznets Curve.



Environmentalists Offer a Big Push for New Green Products:
Evidence from Hybrid Vehicle Registrations

Abstract

Corporations are constantly introducing new products for sale. Some new products, such as hybrid vehicles, are greener than incumbent products. Certain environmental externalities could be mitigated if there were guinea pigs willing to commit to purchase new green products. In the presence of fixed costs of production, firms will be more likely to produce such products if they anticipate that there is demand. This paper examines the role of environmentalists as “guinea pigs” who purchase unproven new products that are likely to offer social environmental benefits. I focus on the diffusion of registered Hybrid vehicles across Los Angeles census tracts between 2001 and 2005. When this “green” product is first introduced, environmentalists are early purchasers. As the price of gasoline rises in 2004 and 2005, “brown” communities increase their purchases.


Why did I write these papers? In the case of the second paper, I'm genuinely interested in "free market" environmentalism and the big question of whether for-profit producers can make money developing and marketing green products.

In the case of the first paper, I'm interested in consumer heterogeneity and I figured that environmentalism is a distinguishing feature of some people and I wanted to explore if this attribute has explanatory power.

As usual, both of these papers are empirical papers. I have followed my typical approach of collecting several data sets and testing hypotheses of interest.

Sunday, November 12, 2006

Finding "Green" Neighbors and Product Bundling

I'm getting ready to move to Los Angeles, and I don't drive. Apparently, I'll be in the minority there. In a world where people differ with respect to their tastes for "new urban" living, how do greens co-ordinate and find each other? In Portland, new condo developments are being constructed close to rail transit stations, and these buildings offer NO parking. Such buildings will self-select greens to live there.

Developers must recognize that the characteristics of the buildings they design embody not only the physical location of the building in a city and the structural attributes of each housing unit. In addition, there is an "emergent property": who lives in the neighboring apartments? Is the typical neighbor a "yuppie"? A "truck driver"? Greens may be willing to pay a price premium for such a new urban unit if they can be guaranteed that their neighbors will share their values. The absence of parking may help to achieve this goal. The interesting economic issue here is the co-ordination problem: how do strangers with similar tastes who want to form a sub-community find each other in a world of weirdos and search costs?


Here is the New York Times

November 12, 2006
National Perspectives
No Parking: Condos Leave Out Cars
By LINDA BAKER

PORTLAND, Ore.

ANNEMIEKE CLARK and her boyfriend, Daniel Pasley, do not spend a lot of time driving. Ms. Clark, a 29-year-old nursing student at Oregon Health and Science University, takes the bus to school. Her boyfriend is a “crazy bike rider,” she said.

So when they decided to buy their first home last winter, they chose a one-bedroom unit in the Civic, one of the first new developments in Portland to market condominiums without parking spaces.

Ms. Clark said they bought the $175,000 condo, which will be ready next summer, because “it was absolutely the cheapest one selling.” Mr. Pasley also hoped a unit without parking would inspire Ms. Clark to sell her 1992 Subaru.

“So, part of it was idealism — that we would get rid of the car,” Ms. Clark said.

Although condominiums without parking are common in Manhattan and the downtowns of a few other East Coast cities, they are the exception to the rule in most of the country. In fact, almost all local governments require developers to provide a minimum number of parking spaces for each unit — and to fold the cost of the space into the housing price.

The exact regulations, which are intended to prevent clogged streets and provide sufficient parking, vary by city. Houston’s code requires a minimum of 1.33 parking spaces for a one-bedroom and 2 spaces for a three-bedroom. Downtown Los Angeles mandates 2.25 parking spaces per unit, regardless of size.

Today, city planners around the country are trying to change or eliminate these standards, opting to promote mass transit and find a way to lower housing costs.

Minimum parking requirements became popular in the 1950s with the growth of suburbia, said Donald Shoup, a professor of urban planning at the University of California at Los Angeles and the author of “The High Cost of Free Parking” (American Planning Association, 2005). “They spread like wildfire,” he said.

But in the 21st century, skyrocketing housing prices and the move toward high-density urban development are bringing scrutiny to the ways in which cities and developers manage the relationship between parking and residential real estate. Once a tool of government, parking requirements are increasingly driven by the market.

Last year, for example, Seattle reduced parking requirements for multifamily housing in three of the city’s major commercial corridors. Next month, the City Council will vote on a proposal to eliminate minimum parking requirements in Seattle’s six core urban districts and near light-rail stations. In June, San Francisco replaced minimum requirements downtown with maximum standards allowing no more than 0.75 parking spaces per unit. In Portland, where central city parking minimums were eliminated six years ago, developers are breaking ground on projects with restricted parking.

“In the future,” Dr. Shoup said, “we will look back at minimum parking requirements as a colossal mistake. Change will be slow, but it’s happening now.”

The Civic, a 261-unit project, includes 24 condos without parking. The building is six blocks from downtown and near a major bus and light-rail line, and will offer residents a rental-car-sharing arrangement.

“We’re always looking for ways to promote smart growth,” said Tom Cody, a project manager of the Gerding/Edlen Development Company, which developed the Civic. “We decided to test the water and see if there was a market for units without parking spaces.” The 24 condos sold out, he said.

In San Francisco, more downtown housing has been approved over the last few years than in the last 20 years combined, said Joshua Switzky, a city planner. The booming real estate market there inspired local officials to revoke minimum-parking requirements in the central core, Mr. Switzky said. “The city’s modus operandi is ‘transit first,’ ” he said. “Everyone recognized the existing rules didn’t match the policy.”

Under San Francisco’s new parking maximums, downtown developers are also required to “unbundle” the price of parking from the price of the condo. “Buyers aren’t obligated to buy a parking space, and developers don’t have the incentive to build spaces they can’t sell,” Mr. Switzky said.

Sustainable development is not the only factor driving changes to parking standards. “We talk about affordable housing as the most critical thing facing cities and the nation,” Mr. Cody said. “But we never talk about the costs of the automobile.” Since individual parking spaces cost about $40,000, reducing or eliminating parking is an effective way to lower housing prices, he said.

At the Moda condominiums, a development under construction in Seattle, only 43 out of 251 units have assigned parking. Eighty-three units have no parking and the remainder have access to a permit parking system. The building is in the downtown Belltown neighborhood, where the average condo has one and a half parking spaces.

“I wanted the least expensive unit,” said Mary Stonecypher-Howell, a computer database specialist who bought a Moda studio without parking for $170,000. Ms. Stonecypher-Howell said it was the only downtown condo she could find for less than $200,000. “In the city, it’s simpler not to have a car,” she said. Moda units with parking cost about $30,000 more than units without.

Lenders traditionally balk at financing projects without parking, said David Hoy, who developed the Moda condos. The concern is that they would be difficult to resell. “But in a high-density urban environment, there’s a strong demand and a shortage of supply,” Mr. Hoy said. Moda, which is financed by United Commercial Bank, sold out in less than a week, he said.

Other cities are also reconsidering parking standards. In Houston, for example, a committee is reviewing parking minimums along the light-rail line, according to Suzy Hartgrove, a spokeswoman for the city’s planning and development department.

But not everybody is enthusiastic about the piecemeal changes taking place around the country, especially because often-arcane parking codes vary from district to district and city to suburb.

In the Rincon Hill neighborhood of San Francisco, where the new luxury tower One Rincon Hill is selling for $1,000 a square foot, parking standards allow a maximum of one space per unit. Just a few blocks away, downtown requirements undercut that figure by a quarter, making One Rincon Hill more attractive to buyers with cars.

“It gives them a marketing advantage,” said Victor Gonzalez, director of development for Monahan Pacific, a local company that has built condo properties downtown. “You’d be killed if you tried to do a project in the suburbs without parking,” he added.

Others point to the free-market parking situation in Manhattan, where monthly rates now exceed $500 a month.

Planners are undeterred. In the United States, “housing is expensive and parking is cheap,” Dr. Shoup said. “We’ve got it the wrong way around.”