Friday, November 06, 2009

Roger Pielke Jr. on climate change mitigation

Yesterday I heard Roger Pielke Jr. speak twice at Arizona State University, first in a talk to the Consortium for Science, Policy and Outcomes (CSPO) on climate change mitigation, and second in a class lecture on climate change adaptation. This post is about the former.

His talk was entitled "The Simple Math of Emissions Reduction," and began with a quote from Steve Rayner of Oxford University:
Wicked Problems
have Clumsy Solutions
requiring Uncomfortable Knowledge
which he then followed up with a slide on "Where I stand," which included the following bullet points (nearly, but probably not exactly verbatim):
  • Strong advocate for mitigation and adaptation policies
  • Continuing increase in atmospheric CO2 could pose large risks, as described by IPCC
  • Stabilizing concentrations at low levels can’t succeed if we underestimate the challenge (and we have)
  • Mitigation action will not result from elimination of all scientific uncertainty
  • Poisonous politics of the climate debate serves to limit a broader discussion of options
  • Ultimately technological innovation will precede political action, not vice versa
Regarding the IPCC, he says he has no debate with working group I on the science, some disagreements with working group II on impacts, adaptation, and vulnerability, and lots of debate with working group III on economics and mitigation, which this talk covers.

His slide for the outline of his talk looked like this:
  • Understanding the mitigation challenge
  • Where do emissions come from?
  • Decarbonization
  • The UK as a cautionary tale for U.S. policymakers
  • The U.S. situation and Waxman-Markey/Boxer-Kerry
  • How things might be different
Understanding the mitigation challenge

Although climate change involves other greenhouse gases besides CO2, he focused on CO2 and in this part of the talk gave a summary of CO2 accumulation in the atmosphere as a stock and flow problem, using a bathtub analogy. The inflow of CO2 into the atmosphere is like water pouring out of the faucet, the outflow is the water going down the drain, and the water in the tub is the accumulated CO2 in the atmosphere. The inflow is about 9 GtC (gigatons of carbon) per year and growing, and is expected to hit 12 GtC per year by 2030. The current stock is a concentration of about 390 parts per million (ppm), increasing by 2-3 ppm/year. The outflow is natural removal of about 4 GtC/year. To stop the stock from increasing, the amount going in has to equal the amount going out. An 80% reduction in emissions by 2050 is expected to limit the stock to 450 ppm.
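To make the bathtub arithmetic concrete, here's a minimal stock-and-flow sketch in Python using the round numbers above. The conversion factor of roughly 2.13 GtC per ppm of CO2 is my own added assumption for illustration; it was not part of the talk.

```python
# A minimal sketch of the bathtub analogy, using the round numbers above
# (about 9 GtC/year in, about 4 GtC/year natural removal, roughly 390 ppm now).
# The ~2.13 GtC-per-ppm conversion is an added assumption, not from the talk.

GTC_PER_PPM = 2.13  # approximate gigatons of carbon per ppm of atmospheric CO2

def project_concentration(start_ppm=390.0, inflow_gtc=9.0, outflow_gtc=4.0,
                          annual_inflow_growth=0.0, years=20):
    """Step the 'bathtub' forward one year at a time and return the final ppm."""
    ppm = start_ppm
    for year in range(1, years + 1):
        net_gtc = inflow_gtc - outflow_gtc      # carbon that stays in the tub
        ppm += net_gtc / GTC_PER_PPM            # convert mass to concentration
        inflow_gtc *= 1 + annual_inflow_growth  # optional growth in emissions
    return ppm

# Constant 9 GtC/year emissions add roughly 2.3 ppm per year, in line with the
# observed 2-3 ppm/year increase mentioned in the talk.
print(f"after 20 years: {project_concentration():.0f} ppm")
```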

Emissions have been growing faster than the IPCC projected in 2000, with a 3.3% average increase per year between 2000 and 2007. While the economic slump reduced emissions in 2009, emissions growth is expected to resume with economic recovery.

Where do emissions come from?

Pielke used the following four lines to identify policy-relevant variables:
People
engage in economic activity that
uses energy
from carbon-emitting generation
The associated variables:
Population (P)
GDP per capita (GDP/P)
Energy intensity of the economy (Total Energy (TE)/GDP)
Carbon intensity of energy (C/TE)
Total carbon emissions = P * (GDP/P) * (TE/GDP) * (C/TE).

This formula is known as the "Kaya Identity."
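As a sketch, the identity is easy to express in code. The sample inputs below are round, illustrative numbers I've chosen so the product lands near the roughly 9 GtC/year figure mentioned earlier; they are not values Pielke presented.

```python
# The Kaya Identity as code: total emissions are the product of the four
# policy-relevant factors. The sample inputs are illustrative round numbers,
# not figures from the talk.

def kaya_emissions(population, gdp_per_capita, energy_per_gdp, carbon_per_energy):
    """Total carbon emissions = P * (GDP/P) * (TE/GDP) * (C/TE)."""
    return population * gdp_per_capita * energy_per_gdp * carbon_per_energy

P = 6.7e9        # people
GDP_P = 7_000    # dollars of GDP per person
TE_GDP = 8.0e6   # joules of primary energy per dollar of GDP
C_TE = 2.4e-11   # tons of carbon emitted per joule of primary energy

total_gtc = kaya_emissions(P, GDP_P, TE_GDP, C_TE) / 1e9
print(f"about {total_gtc:.1f} GtC per year")  # lands near the ~9 GtC/yr figure
```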

The policy tools available to reduce emissions by affecting these variables are: (1) population management, to end up with fewer people; (2) limiting the generation of wealth, to have a smaller economy; (3) doing the same or more with less energy, by increasing efficiency; and (4) switching to energy sources that generate energy with lower emissions.

And that's it. Cap-and-trade, carbon taxes, etc. are designed to influence these variables.

Pielke then combined the first two variables (P * GDP/P) to get GDP, and the second two (TE/GDP * C/TE) he identified as Technology.

He argued that reducing GDP or GDP growth is not a policy option, so Technology is the only real policy option. Regarding the former point, he put up a graph very much like the Gapminder.org graph of world income, and observed that the Millennium Development Goals are all about pushing the people below $10/day--80% of the world's population--on that graph to the right. Even if all of the OECD nations were removed from the graph, there would still be a push to increase the GDP for the remainder and there would still be growing emissions.

He quoted Gwyn Prins regarding the G8 Summit to point out how policy makers are conflicted--they had a morning session on how to reduce gas prices for economic benefit, and an afternoon session on how to increase gas prices for climate change mitigation.

With this kind of a conflict, Pielke said, policy makers will choose GDP growth over climate change.

So that leaves Technology as an option, and he turned to the topic of decarbonization.

Decarbonization

Pielke put up a graph of global CO2 emissions per $1,000 of GDP over time, which showed a steady improvement in efficiency. In 2006, emissions were 29.12 Gt CO2; divided by $47.267 trillion of GDP, that gives 0.62 tons of CO2 per $1,000 of GDP. In 1980, the figure was above 0.90 tons of CO2 per $1,000 GDP.
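The same back-of-the-envelope arithmetic as code, using only the two figures quoted above:

```python
# Carbon intensity of the global economy in 2006, from the figures in the talk.
emissions_t_co2 = 29.12e9   # tons of CO2 emitted in 2006
gdp_dollars = 47.267e12     # world GDP in 2006, in dollars

intensity = emissions_t_co2 / (gdp_dollars / 1_000)      # tons CO2 per $1,000 GDP
print(f"{intensity:.2f} tons of CO2 per $1,000 of GDP")  # prints 0.62
```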

Overall emissions track GDP, and in recent years the global economy has been becoming more, not less, carbon intensive.

He looked at carbon dioxide per GDP (using purchasing power parity (PPP) for comparison between countries) for four countries: Japan, Germany, the U.S., and China (ordered from most to least efficient). Japan hasn't changed much over time, but is very carbon efficient (below 0.50 tons of CO2 per $1,000 GDP). Germany and the U.S. are about the same, slightly above 0.50 tons of CO2 per $1,000 GDP, and both have improved similarly over time. China got worse from 2002-2006 and is at about 0.75 tons of CO2 per $1,000 GDP.

He put up a slide of the EU-15 countries' decarbonization rates pre- and post-Kyoto Protocol, and though there was a gap between them, the slopes appeared to be comparable. For the first ten years of Kyoto, then, he said, there's no evidence of any improvement in the background rate of decarbonization. The pre-Kyoto series went from above 0.55 tons of CO2 per $1,000 GDP to about 0.50; the post-Kyoto series went from about 0.50 to below 0.45.
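For what it's worth, turning those endpoints into average annual decarbonization rates does suggest similar slopes. The ten-year span I use for each period is an assumption on my part; the slide's exact start and end years weren't recorded.

```python
# Average annual decline in carbon intensity implied by the endpoints above.
# The ten-year span for each period is an assumption; exact years weren't given.

def annual_decline(start_intensity, end_intensity, years):
    """Average annual rate of decline in tons of CO2 per $1,000 of GDP."""
    return 1 - (end_intensity / start_intensity) ** (1 / years)

pre_kyoto = annual_decline(0.55, 0.50, years=10)
post_kyoto = annual_decline(0.50, 0.45, years=10)
print(f"pre-Kyoto:  {pre_kyoto:.1%} per year")   # ~0.9%
print(f"post-Kyoto: {post_kyoto:.1%} per year")  # ~1.0%
```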

At this point, Clark Miller (head of my program in Human and Social Dimensions of Science and Technology) pointed out that given Japan, there is no reason to assume that there should have been a continuing downward trend at all, but Pielke reiterated that since the slopes appeared to be the same there's no evidence that Kyoto made a difference.

The UK as a cautionary tale for U.S. policymakers

Pielke identified the emissions targets of the UK Climate Change Act of 2008:

Average annual reductions of 2.8% from 2007 to 2020, to reach 42% below 1990 levels by 2020.

Average annual reductions of 3.5% from 2020, to reach 80% below 1990 levels by 2050.

The former target of 42% below 1990 levels is contingent upon COP15 reaching an agreement this December; otherwise the unilateral target is 34% below 1990 levels.
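As a rough check on how those annual rates compound, here's a small sketch. It expresses the results relative to 2007 levels only, since the post doesn't say where 2007 emissions stood relative to the 1990 baseline that the percentage targets use.

```python
# How constant annual reduction rates compound over the two UK target periods.
# Results are relative to 2007 emissions; mapping them onto the 1990 baseline
# would require knowing where 2007 stood relative to 1990, which isn't given.

def remaining_fraction(annual_reduction, years):
    """Fraction of starting-year emissions left after compounding reductions."""
    return (1 - annual_reduction) ** years

phase1 = remaining_fraction(0.028, 2020 - 2007)  # 2.8% per year, 2007-2020
phase2 = remaining_fraction(0.035, 2050 - 2020)  # 3.5% per year, 2020-2050

print(f"2020 emissions: {phase1:.0%} of 2007 levels")           # ~69%
print(f"2050 emissions: {phase1 * phase2:.0%} of 2007 levels")  # ~24%
```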

Pielke showed a graph of the historical rate of decarbonization for the UK economy, and compared it to graphs of manufacturing output and manufacturing employment, observing that the success of decarbonization of the UK economy from 1980-2006 has been due primarily to offshoring of manufacturing, something that's not sustainable--once they reach zero, there's nowhere further down to go.

He then used France as a point of comparison, since it has the lowest CO2/GDP output of any developed country, due to its use of nuclear power for most of its electricity--it's at 0.30 tons of CO2 per $1,000 GDP, and a lot of that is emissions from gasoline consumption for transportation.

It took France about 22 years, from 1984-2006, to get its emissions to that rate.

For the UK to hit its 2020 target, it needs to improve to France's rate in the next five years, by 2015. That means building 30 new nuclear power plants and reducing the equivalent coal and gas generation; Pielke said he would "go out on a limb" and say that this won't happen.

That will only get them 1/3 of the way to their 2020 goals.

The UK plan calls for putting 1.7 million electric cars on the road by 2020, which means doubling the current rate of auto sales and selling only electric cars.

For the entire world to reach France's level of efficiency by 2015 would require a couple of thousand nuclear power plants.

The U.S. situation and Waxman-Markey/Boxer-Kerry

The U.S., said Pielke, has had one of the highest rates of sustained decarbonization, from 1980-2006, going from over 1.00 tons of CO2 per $1,000 GDP to the current level of about 0.50 tons of CO2 per $1,000 GDP.

The Waxman-Markey target is an 80% reduction by 2050, not quite as radical as the UK.
The Boxer-Kerry target is a 17% reduction by 2020.

Pielke broke down the current U.S. energy supply by source in quadrillions of BTUs (quads), and pointed out that he got all of his data from the EIA and encouraged people to look it up for themselves:
Petroleum: 37.1
Natural gas: 23.8
Coal: 22.5
Renewable: 7.3
Nuclear: 8.5
Total energy was about 99.2 quads in 2007, of which 83.4 came from coal, natural gas, and petroleum.

Emissions by source:
Coal: 95 MMt CO2/quad
Natural gas: 55 MMt CO2/quad
Petroleum: 68 MMt CO2/quad
Multiply those by the amount of energy produced by each source and add them up:
95 * 22.5 + 55 * 23.8 + 68 * 37.1 = 5,969 MMt CO2
The actual total emissions were about 5,979 MMt CO2, so the above back-of-the-envelope calculation was pretty close.
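The same arithmetic in a few lines of code, using the EIA-derived figures above:

```python
# Back-of-the-envelope U.S. CO2 emissions for 2007: quads of primary energy by
# fossil fuel times the MMt of CO2 emitted per quad, summed across fuels.

quads_2007 = {"coal": 22.5, "natural gas": 23.8, "petroleum": 37.1}
mmt_co2_per_quad = {"coal": 95, "natural gas": 55, "petroleum": 68}

total = sum(quads_2007[fuel] * mmt_co2_per_quad[fuel] for fuel in quads_2007)
print(f"estimated emissions: {total:,.0f} MMt CO2")  # ~5,969 vs. ~5,979 actual
```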

U.S. energy consumption is projected to grow to about 108.6 quads, of which 21 quads will come from renewables and nuclear (40% growth from 2007), leaving 87.2 quads from fossil fuels, a 4.6% increase from 2007.

If we substituted natural gas for all coal, then our 2020 emissions would be 5,300 MMt CO2, higher than the 2020 target and 12% below 2005, and would still lock us into a carbon intensive future.

In order to meet targets, we need to reduce coal consumption by 40%, or 11 quads, and replace that with renewables plus nuclear, plus an additional 3.8 quads of growth by 2020.

One quad equals about 15 nuclear plants, so 14.8 quads means building 222 new nuclear plants (on top of the 104 that are currently in the U.S.).

Or, alternatively, assuming 100 concentrated solar power installations (30 MW peak each) per quad, that's 1,480 such installations for 14.8 quads, or one brought online every two days until 2020.

Or, assuming 37,500 wind turbines (80 kW peak each) per quad, that's 555,000 wind turbines for 14.8 quads, or one 150-turbine wind farm brought online daily until 2020.
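These conversions all boil down to multiplying the 14.8-quad shortfall by a per-quad equivalence for each technology. A quick sketch, using the per-quad figures as given in the talk:

```python
# Converting the 14.8-quad carbon-free shortfall into hardware counts, using
# the per-quad equivalences quoted in the talk (15 nuclear plants, 100 CSP
# installations, or 37,500 wind turbines per quad).

SHORTFALL_QUADS = 14.8

per_quad = {
    "nuclear plants": 15,
    "concentrated solar installations": 100,
    "wind turbines": 37_500,
}

for technology, units in per_quad.items():
    print(f"{units * SHORTFALL_QUADS:,.0f} {technology} for {SHORTFALL_QUADS} quads")
```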

To reach these targets with wind and solar would require increasing them by a factor of 37 by 2020; Obama has promised only a tripling.

Could we meet the targets by increasing efficiency of our energy consumption? We would have to reduce total energy consumption to 85.5 quads by 2020 (rather than 108.6), about equal to U.S. energy consumption in 1992, when the U.S. economy was 35% smaller than in 2007. That would be improving efficiency by about a third.

How fast can decarbonization occur? We don't know, because no one has really set out to intentionally do that. Historical rates have been 1-2% per year by developed countries; for short periods, some countries have exceeded 2% per year. Japan, from 1981-1986, improved by over 4% per year.

Pielke argued that these targets are not feasible in the U.S. or the UK, and so policy makers are adding safety valves, offsets, and other mechanisms to allow some manipulation to give the appearance of success. Achieving an 80% reduction in global emissions by 2050 requires more than 5% decarbonization per year.
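A rough way to see where a number like "more than 5% per year" comes from: emissions equal GDP times carbon intensity, so if GDP keeps growing, intensity has to fall faster than emissions alone. The 2.5% GDP growth rate below is my own illustrative assumption, not a figure from the talk.

```python
# Required annual decline in carbon intensity (C/GDP) to cut emissions 80% by
# 2050 while GDP grows. The 2.5% GDP growth rate is an illustrative assumption.

def required_intensity_decline(emission_cut, years, gdp_growth):
    """Annual rate at which C/GDP must fall to deliver the emissions cut."""
    emissions_ratio = 1 - emission_cut     # e.g., 0.2 remaining for an 80% cut
    gdp_ratio = (1 + gdp_growth) ** years  # cumulative GDP growth over the period
    return 1 - (emissions_ratio / gdp_ratio) ** (1 / years)

rate = required_intensity_decline(emission_cut=0.80, years=2050 - 2009, gdp_growth=0.025)
print(f"required decarbonization: {rate:.1%} per year")  # roughly 6%
```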

The problem, Pielke argued, is that the policy logic of targets and timetables is backwards, and we should focus on improving efficiency and decarbonization rather than emissions targets.

How things might be different

Pielke's suggested alternative strategy was presented in a slide something like this:
  • Focus policy on decarbonization of the economy (not simply emissions)
  • Efficiency gains (follow the Japanese model, “frontrunner program” by industry, look at best performer and set it as regulatory standard)
  • Expand carbon free energy (low carbon tax, other policies--subsidies, regulation, etc.)
  • Innovation-focused investments
  • To create ever advancing frontier of potential efficiency gains
  • Air capture backstop
  • Adaptation
The Japanese "frontrunner" program was where the government went industry by industry, identified the most efficient company in each industry, and set regulations to make that company the baseline standard for the other companies to meet.

Pielke argued that there should be a carbon tax of, say, $5/ton (or whatever is the "highest price politically possible"), with the collected funds (that would raise about $700B/year) used to promote innovation in energy efficiency.

If we find that we're stabilizing at 635 ppm, we may want to "brute force" some removal of carbon from the atmosphere (e.g., geoengineering).

In the Q&A session, Clark Miller questioned Pielke's claim about the impossibility of quickly replacing our energy infrastructure--if it costs $2.61B for a 1,400 MW nuclear plant, we'd need 65 of them (fewer than Pielke's number, since he assumed smaller plants) at a cost of $260B. Since there is capital floating around causing asset bubbles in the trillions, and the energy industry is expected to become a $15T industry, surely there would be some drive to build them if they're going to become profitable. (Not to mention peak oil as a driver.) He agreed that it would take longer to construct these, but asked what the upshot would be if this were done by, say, 2075.

Also in the Q&A, Pielke pointed out that in a previous presentation of this talk, a philosophy professor had suggested that the population variable could be affected by handing out cyanide pills. (Or by promoting the growth of the Church of Euthanasia.) What I didn't mention above was that Pielke also briefly discussed improvements to human lifespan, and in his other talk (summary to come), he talked about how the IPCC's projections assume that we will not try to eradicate malaria...

ADDENDUM (November 7, 2009): I've seen estimates that U.S. carbon emissions will be about 6% lower in 2009 as a result of the recession, which amounts to considerable progress towards the Boxer-Kerry target. Projections of an economic recovery in 2010 strike me as overly optimistic; in my opinion there's a strong possibility that we haven't hit bottom yet and there's worse to come. Still, though, I think Pielke's probably right that energy consumption will go right back up again unless the recession becomes a depression and results in significant changes in consumption habits.

My summary of Pielke's lecture on climate change adaptation is here.

ADDENDUM (November 9, 2009): It should be noted that Roger Pielke, Jr. is a somewhat controversial figure in the climate change debate, and believed by many in the climate change blogosphere to be in the climate change skeptic camp, or to be biased towards them in terms of where he levels his criticisms. A post titled "Who Framed Roger Pielke?" from the Only In It For the Gold blog links to a number of opinions expressing these views.

UPDATE (February 5, 2010): A post titled "The Honest Joker" at Rabett Run critiques Pielke Jr.'s stance as an "honest broker," calling it a sham.

UPDATE (August 28, 2010): A talk by Pielke that appears to have some similarity to this one may be found here.

UPDATE (July 13, 2014): An updated version of the information in this talk is Ch. 3 of Pielke Jr.'s book, The Climate Fix (2010, Basic Books).

Charles Phoenix's retro slide show--in Phoenix


Tonight and tomorrow night at 8 p.m., Charles Phoenix will bring his Retro Slide Show Tour to the Phoenix Center for the Arts at 1202 N. 3rd St.

I've not seen his show before, but I've enjoyed his blog's slide-of-the-week feature and plan to go see this.

Here's the official description:

A laugh-out-loud funny celebration of '50s and '60s road trips, tourist traps, theme parks, world's fairs, car fashion fads, car culture and space age suburbia, will also include a selection of vintage images of the Valley of the Sun.

Click the above link for more details or to buy tickets.

Wednesday, November 04, 2009

Where is the academic literature on skepticism as a social movement?

Here's all I've been able to find so far, independent of self-descriptions from within the movement (and excluding history and philosophy of Pyrrhonism, Academic Skepticism, the Carvaka, the Enlightenment, British Empiricism, and lots of work on the development of the enterprise of science):
  • George Hansen, "CSICOP and the Skeptics: An Overview," The Journal of the American Society for Psychical Research vol. 86, no. 1, January 1992, pp. 19-63. I've not seen a more detailed history of contemporary skepticism elsewhere.
  • Stephanie A. Hall, "Folklore and the Rise of Moderation Among Organized Skeptics," New Directions in Folklore vol. 4, no. 1, March 2000.
  • David J. Hess, Science in the New Age: The Paranormal, Its Defenders and Debunkers, and American Culture, 1993, The University of Wisconsin Press.
I note that Paul Kurtz's The New Skepticism: Inquiry and Reliable Knowledge (1992, Prometheus Books) puts contemporary skepticism in the lineage of several of the other forms of philosophical skepticism I mentioned above, identifying his form of skepticism as a descendant of pragmatism in the C.S. Peirce/John Dewey/Sidney Hook tradition (and not the Richard Rorty style of pragmatism). But I think that says more about Kurtz than about the skeptical movement, which also draws upon other epistemological traditions and probably doesn't really have a sophisticated epistemological framework to call its own.

There's a lot of literature on parallel social movements of various sorts, including much about advocates of some of the subject matter that skeptics criticize, and some of that touches upon skeptics. For example:
  • Harry Collins and Trevor Pinch, "The Construction of the Paranormal: Nothing Unscientific is Happening," in Roy Wallis, editor, On the Margins of Science: The Social Construction of Rejected Knowledge, 1979, University of Keele Press, pp. 237-270.
  • Harry Collins and Trevor Pinch, Frames of Meaning: The Social Construction of Extraordinary Science, 1982, Taylor & Francis.
  • Ronald L. Numbers, The Creationists: From Scientific Creationism to Intelligent Design, 2nd edition, 2006, Harvard University Press.
  • Christopher P. Toumey, God's Own Scientists: Creationists in a Secular World, 1994, Rutgers University Press.
The Toumey book doesn't really have anything about skeptics, but it is an anthropological study of creationists in the United States that describes the connection between "creationism as a national movement" and "creationism as a local experience." That connection seems intriguingly similar to the skeptical movement, especially in light of the fact (as I mentioned in my previous post) that national skeptical organizations are independent of established institutions of science, provide the key literature of the movement, and at least implicitly assume that the average layman can develop the ability to discern truth from falsehood, at least within a particular domain, from that literature.

In some ways, the skeptical movement also resembles a sort of layman's version of the activist element in the field of science and technology studies, based on positivist views of science that are the "vulgar skepticism" dismissed in this article.
I think if contemporary skepticism wants to achieve academic respectability, it will need to develop a more sophisticated view of science that comes to terms with post-Popper philosophy of science and post-Merton sociology of science; my recommendation for skeptics who are interested in that subject is to read, as a start:
  • Philip Kitcher, The Advancement of Science: Science Without Legend, Objectivity Without Illusions, 1995, Oxford University Press.
There's an enormous relevant literature on those topics; an interesting broad overview is:
  • R.C. Olby, G.N. Cantor, J.R.R. Christie, and M.J.S. Hodge, Companion to the History of Modern Science, 1990, Routledge.
I welcome pointers to any relevant sources that I've missed, particularly other academic work specifically addressing the history, philosophy, sociology, and anthropology of the contemporary skeptical movement--three sources ain't much.

UPDATE (September 27, 2014): Some additional works I recommend for skeptics:

  • Harry Collins, Are We All Scientific Experts Now?, 2014, Polity Press.  A very brief and quick overview of science studies with respect to expertise.
  • Massimo Pigliucci, Nonsense On Stilts: How to Tell Science from Bunk, 2010, University of Chicago Press. A good corrective to the overuse of Popper, easy read.
  • Massimo Pigliucci and Maarten Boudry, Philosophy of Pseudoscience: Reconsidering the Demarcation Problem, 2013, University of Chicago Press. Good collection of essays reopening the debate many thought closed by Larry Laudan on whether there can be philosophical criteria for distinguishing the boundary between science and pseudoscience.

What are the goals of Skepticism 2.0?

Yesterday I listened to D.J. Grothe's interview with Ben Radford on the Point of Inquiry podcast about the latest issue of the Skeptical Inquirer (November/December 2009) about "Skepticism 2.0," the bottom-up grassroots expansion of the skeptical movement through Internet communications tools like blogs, podcasts, online videos and forums, and the real-world activities that have become possible through them, like meetups and SkeptiCamps.

Near the end of the podcast, D.J. asked Ben what he thought would be the results of Skepticism 2.0 in five years time. He said (1) more skeptics and (2) more cooperative projects between the three major U.S. skeptical groups, the Committee for Skeptical Inquiry, the James Randi Educational Foundation, and the Skeptics Society.

That struck me as a rather disappointingly modest set of goals, as well as rather insular, "old school" skepticism thinking. Surely we can come up with ideas for something more exciting, interesting, and useful over the next five years than merely the self-perpetuation and growth of the skeptical movement and cooperation among the traditional top-down skeptical organizations.

A few thoughts that came to my mind:
  • If skeptics want to promote public understanding of science and critical thinking, why not partnerships with other organizations that also have those purposes? The National Academies, the National Center for Science Education, teachers' groups and school groups at a local level?
  • If skeptics want to promote the activity of science, why not look at ways to help motivate students to enter science as a career, and support them in doing so? I've previously suggested to Phil Plait that JREF might partly model itself after the Institute for Humane Studies, an organization which provides support for undergraduate and graduate students who favor classical liberal political ideals, in order to help them achieve success in careers of thought leadership, including academics, journalists, filmmakers, public policy wonks, and so on. In order for skepticism and critical thinking to have a significant impact, it's not necessary that everyone become a skeptic, only that a sufficient number of people in the right places engage in and encourage critical thinking.
  • If skeptics want to see more diversity in the skeptical movement, why not look at ways to reach out to other communities? The podcast did mention the SkepTrack at Dragon*Con, which is one of the most innovative ideas for outreach for skeptical ideas since the founding of CSICOP in 1976.
  • If skeptics want to act as a form of consumer protection against fraud and deception, why not try to find ways to interact with regulators, investigators, politicians, and the media to get fraudulent products and services off the market? The UK complaints against chiropractors making false claims on their websites as a response to the British Chiropractic Association libel lawsuit against Simon Singh, or the Australian complaint against bogus claims by anti-vaccinationists (though see my comment on that blog post for some reservations) might suggest some ideas.
It seems to me that the skeptical movement should be concerned about more than just increasing its own numbers and getting the existing national groups to work together. I think that Skepticism 2.0 has forced and will continue to force the existing groups to cooperate with each other and with the grassroots movement if they don't want to become obsolete and irrelevant. And at this point growth is, at least for the near term, a foregone conclusion. But in order to continue to grow and thrive, there should be some goals that have something to do with being useful and making the world a better place, by which the skeptical movement can measure its effectiveness and success.

I'm sure readers of this blog have further suggestions. What else?

Addendum:

By the way, with regard to my first suggestion, here's a question that may provide some motivation and food for thought: Why do the Parapsychological Association and the National Center for Complementary and Alternative Medicine have better and more formal ties to official institutions of science than any skeptical organization? The PA is a member of the AAAS, and NCCAM is an agency within the National Institutes of Health. The main difference between those organizations and skeptical organizations is that they actually do and publish peer-reviewed scientific research.

Tuesday, November 03, 2009

More Scientology exposure from the St. Pete Times

The St. Petersburg Times has published another three-part exposé on the Church of Scientology based on interviews with former high-level members. (The first three-part series from June is discussed here; I missed the second three-part series from August about new defectors; all three series may be found on the SP Times website here.)

Part 1 (October 31): "Chased by their church: When you leave Scientology, they try to bring you back"

An overview of this new, third series of exposures based on information from former high-ranking members of the Church of Scientology such as Mark "Marty" Rathbun and Mike Rinder.

The story of how the church commands and controls its staff is told by the pursuers and the pursued, by those who sent spies and those spied upon, by those who interrogated and those who rode the hot seat. In addition to Rathbun, they include:

• Mike Rinder, who for 25 years oversaw the church's Office of Special Affairs, which handled intelligence, legal and public affairs matters. Rinder and Rathbun said they had private investigators spy on perceived or potential enemies.

They say they had an operative infiltrate a group of five former Scientology staffers that included the Gillham sisters, Terri and Janis, two of the original four "messengers" who delivered Hubbard's communications. They and other disaffected Scientologists said they were spied on for almost a decade.

• Gary Morehead, the security chief for seven years at the church's international base in the desert east of Los Angeles. He said he helped develop the procedure the church followed to chase and return those who ran, and he brought back at least 75 of them. "I lost count there for awhile."

Staffers signed a waiver when they came to work at the base that allowed their mail to be opened, Morehead said. His department opened all of it, including credit card statements and other information that was used to help track runaways.

• Don Jason, for seven years the second-ranking officer at Scientology's spiritual mecca in Clearwater, supervised a staff of 350. He said that after he ran, he turned himself in and ended up locked in his cabin on the church cruise ship, the Freewinds. He said he was held against his will.

Part 2 (November 2): "Scientology: What happened in Vegas"

How ex-members Terri and Janis Gillham, who had been Sea Org "messengers" for L. Ron Hubbard and whose legal guardian had been Hubbard's wife Mary Sue, had their mortgage business in Las Vegas infiltrated by spies working for the Church of Scientology to keep tabs on what they were up to. Mark Fisher, Scientology head David Miscavige's aide de camp for seven years, was spied on by the man he thought was his best friend.

Part 3 (November 3): "Man overboard: To leave Scientology, Don Jason had to jump off a ship"

After leaving the Church once and returning, Don Jason was put aboard the Freewinds, a Scientology ship, and monitored constantly. He managed to get off the ship in the Bahamas by effectively zip-lining down a cable with a home-made device, and getting on a plane to Milwaukee by way of Tampa and Atlanta. Someone from the Church booked the seat next to his, and Rathbun (still in the Church at the time) met him at Tampa, and then bought a ticket on his flight, to try to talk him into returning.

Sunday, November 01, 2009

More apparent plagiarism from Ian Plimer

Eli Rabett and Pieter Tans identified some errors in Ian Plimer's book's claim of selective data reporting from Mauna Loa measurements of atmospheric carbon, a claim that Tim Lambert at the Deltoid ScienceBlog traces to climate change skeptic Ferdinand Engelbeen. But Plimer doesn't cite Engelbeen, perhaps because Engelbeen also refutes the argument Plimer is trying to make.

This is not the first time Plimer has copied without quoting or citing sources--multiple instances in his book Telling Lies for God have previously been identified by Jeffrey Shallit and me.

(Previously on Plimer at this blog.)

Friday, October 30, 2009

Maricopa County Notices of Trustee's Sales for October 2009

I haven't posted one of these things in a while, so I figured it was about time.

The big peak was in March, with a total of 10,725 that month. October's total was 6,618.

Robert Balling on climate change

This afternoon I went to hear ASU Prof. Robert Balling, former head of ASU's Office of Climatology and current head of the Geographic Information Systems program, give a talk on climate change that was advertised as "Global Warming Became Climate Change: And the Story Continues," though I didn't notice whether he had a title slide for his presentation.

He began his talk by saying that in 1957, measurements of CO2 began to be made at Mauna Loa (by Charles David Keeling), which established that CO2 is increasing in our atmosphere, largely because of human activity--fossil fuel emissions. It's approaching 390 parts per million (ppm). Last weekend, the "A" on A Mountain near the university was painted green by a bunch of people wearing shirts that say "350" on them, because they want atmospheric CO2 to be stabilized at 350 ppm, which was the level in 1990, the benchmark year for the Kyoto Protocol.

But this isn't remotely feasible, he said, citing the Intergovernmental Panel on Climate Change (IPCC). Even the most optimistic scenario in the IPCC Report has atmospheric carbon continuing to rise until 2100, hitting about 600 ppm. If we reduced emissions to 0, the best case would end up with stabilization at around 450 ppm. Our lifetime will see increasing CO2 levels, no matter what we do. (In other words, the Kyoto benchmark sets a standard for emissions levels to return to, not for a level of atmospheric carbon to return to.)

If you look at the earth's history on a longer scale, atmospheric carbon has been much higher in the past--it was at about 2,500 ppm during the age of the dinosaurs. During the last 600,000 years, however, it has been much lower, and fell below 200 ppm in the last glacial period. This, Balling said, shows what he would identify as a dangerous level of CO2--falling below 160 ppm, which causes plants to die.

There are other greenhouse gases besides CO2 that humans are producing and that have an effect, such as methane and nitrous oxide (N2O), he said.

At this point, he said the greenhouse effect is real--CO2 doubling causes warming--and this has been known for 120 years and "nobody is denying that."

There are climate models, which he said he has great respect for--it's basic physics plus fantastic computing and applied math. Climate modelers, he said, are their own worst critics. Problems for climate models include clouds, water vapor, rain, and the ocean, but lots of things are modeled correctly and the results are generally pretty good. Clouds, he said, are the biggest area of debate. The IPCC models say that clouds amplify warming, but satellite-based measurements suggest that clouds dampen (but don't eliminate) warming. Thus, he concluded, IPCC may be predicting more warming than will actually occur.

He next discussed empirical support for warming, and pointed out that the official plot of global temperatures has no error bars, and the numbers reported come from sensors that don't cover the entire world. How you come up with a global average can be done in different ways, and the different methods produce different results. You can take grid cells, average by latitudinal bands, get two hemispheric averages, and average them together. You can just average all of the data we have. He said that Roger Pielke Sr. questions the use of average temperatures and suggests looking at afternoon high temperatures. Looking at the older end of the chart, he asked, "where were the sensors in 1900? Why no error bars?"

He asked, "Is the earth warming," and said "right now the earth is not warming. I expect it to keep going up, but over the last decade there's been essentially none." He pointed to a recent article in Science magazine, "What happened to global warming?" Many are writing about this, he said, and there could be "1001 different things including sun and oceanic processes." (I don't believe this is correct unless you measure from 1998, which was an El Nino year. Most of the top 10 warmest years in history are post-1998.)

"Scientists are questioning the data," he said, showing photos from Anthony Watts' blog of poorly situated weather stations. "The albedo of the shelter in my backyard has changed as it has decayed," and caused it to report warmer temperatures. He said that people are having a field day taking photos of poor official sites. (What Balling didn't say is that what's important in the data is not absolute temperature but the temperature trends, and the good sites and bad sites both show the same trends.)

He pointed out that there are corrections to the temperature data based on time of measurement, urban heat islands, instruments used, etc. If you look at the raw data for the U.S. from 1979-2000, you see 0.25 degrees Celsius of warming. Sonde data shows 0.24 degrees, MSU's measurements show 0.28, IPCC shows 0.28, and FILNET shows 0.33. He suggested that these corrections to the official data may be inflating the temperature (again, see my previous comment on trends vs. absolute temperature). Sky Harbor Airport produces the official temperature readings for Phoenix, maximizing the urban heat island effect. Many of the city records are from the worst sites, and he suggested that looking at rural temperatures might give a different result.

Another factor is stratospheric turbidity from volcanic eruptions, and he showed a plot of orbital temperatures from satellites vs. stratospheric turbidity. He said that volcanism accounts for about 30% of the trend variability.

The big player in the game, he said, is the sun. Solar irradiance measures showed a significant decline in solar output in 1980, but earth temperature continued upward--he said he mentioned this because he thought it would be used as an objection. In response, he said that "the sun doesn't increase or decrease output over the entire spectrum and there are interactions with stratospheric clouds." He said that there are astrophysicists who argue that this is the major cause of global warming. In the Q&A, he said that there's one group that thinks cosmic ray flux is the major factor in global temperature because it stimulates cloud formation, while another group says that cosmic ray flux is little more than a trivial effect. He also said that this debate takes place in journals that "I find very difficult to read."

There are other confounding variables like El Nino and La Nina, but he said there has "definitely been warming over the last three decades with a discernable human contribution."

He put up a graph of the Vostok reconstruction of temperature based on ice core data, on a chart with temperature changes labeled from -10 to +4 degrees Celsius (mostly in the negative direction), and said we've seen periodic rapid changes up and down without any human contribution.

He talked about the IPCC "hockey stick" graph from 2001, which led to a huge debate about the possibility of bad statistical methods that guaranteed the hockey stick shape. He observed that 1,000 years ago it was as warm as or warmer than today--the Medieval Warm Period--which was missing from the "hockey stick" graph. There was also a "Little Ice Age," also missing from the graph. He said the IPCC has backed away from the hockey stick, and its most recent report includes clear Medieval Warm Period and Little Ice Age periods in its graphs.

He showed a photo of a microwave sounding unit for temperature measurement, and the polar satellite record from 1978 to the present, which showed a big peak in 1998 from El Nino. He said that when he wrote his book in 1992 saying there was no warming, that was true at the time. After 1998, the temperature came back down quickly, and, he said, the satellite record, like the Science article, hasn't seen warming since. He then corrected himself to say, "well, some warming, but not consistent with the IPCC models."

He said there has been high-latitude warming, and the difference between winter and summer warming has supported the numerical models. "But a problem has evolved, which is the most powerful argument of the skeptics." The models predict warming at the surface that increases with altitude in the troposphere. "There should be very strong warming in the middle of the atmosphere, but it's not in the data." This is the main anti-global warming argument of Joanne Nova's "The Skeptics Handbook," which has been distributed to churches throughout the U.S. by the Heartland Institute (an organization supported by the oil industry that has sometimes gotten into trouble due to its carelessness).

At this point, Balling started asking various questions and answering them by quoting from the IPCC reports:

More hurricanes? The IPCC doesn't say this. He cited the 1990, 1996, 2001 (executive summary, p. 5), and 2007 (p. 6) reports, all of which say that there's no indication or no clear trend of increase or decrease of frequency or intensity of hurricanes or tropical cyclones as a result of warming.

The southwestern United States may become drier? Here, he answered affirmatively, pointing out that an ASU professor has an article that just came out in Science on this topic. Atmospheric circulation is decreasing, and soil moisture measures show the southwest is becoming drier. On this, he said, there's "evidence everywhere," and the Colorado River basin in particular is being hit hard. And this is consistent with IPCC predictions. He cited Roy Spencer as saying that "extraordinary predictions require extraordinary evidence." (This actually comes from Carl Sagan, who said "extraordinary claims require extraordinary evidence" in Cosmos.)

Frequency of tornadoes? It's down, not up, and IPCC 2007 p. 308 says there is no evidence to draw general conclusions.

Ice caps are melting? Balling said Arctic yes, Antarctic no. He cited IPCC 2007 p. 6 regarding Antarctic sea ice extent indicating a lack of warming, and p. 13 saying that it's too cold for widespread surface melting. He contrasted this with a slide of a homeless penguin used to argue for action on global warming. The Arctic ice cap "has its problems," he said, and its extent has declined, though it has "rebounded a bit" recently. (In the Q&A, he said that half of the loss in the last six years has been recovered.) He said that experts in sea ice extent identify relative temperature, ocean currents, and wind as more important than temperature--"it's not a thermometer of the planet." In the past, northern sea ice has dropped as southern sea ice has increased, with the overall global extent of sea ice relatively unchanged. In the Q&A, he made it clear that he wasn't saying that temperature isn't a factor at all, but that global temperature is definitely not the main factor, and that temperature is less important than the other factors he identified.

Sea levels changing? He said there's no doubt about this, but the important question is whether the rise is accelerating. He cited Church et al., J. Climate 2004, p. 2624, for a claim of "no detectible secular increase" in the rate of sea level rise, but noted that another article this week says that there is one. IPCC 2007, p. 5, says that it is unclear whether the increase reflects a longer-term trend. The average has been 1.8 mm/year, but with variable rates of change. IPCC 2007, p. 9, says that 125,000 years ago sea levels were likely 4-6 m higher than in the 20th century, due to the retreat of polar ice.

He said that ice melting on Kilimanjaro has been a "poster child" for global warming, but that this sharp decline "started its retreat over 100 years ago," at the end of the Little Ice Age (1600-1850), and is related to deforestation and ocean patterns in the Indian Ocean rather than global warming. It's not in an area where significant warming is expected by climate models, and local temperatures don't show it.

He then talked about a few factors that cause temperature forcing in a negative direction (i.e., cooling)--SO2, which makes clouds last longer, increased dust, and ozone thinning. He said that his entry into the IPCC was his work at the U.S./Mexico border where he found that overgrazed land on the Mexican side caused warming, and it was much cooler on the U.S. side of the border. The dust, however, had a global cooling effect.

The 2001 IPCC report lists global radiative forcings in the negative direction: stratospheric ozone, sulfate, biomass burning, and mineral aerosols. Forcings in the positive direction include CO2 and solar irradiance. The 2007 report adds many more, including contrails from aircraft. A chart from the report lists the level of scientific understanding for each factor, and he observed that it's "low" for solar irradiance.

He cited a quote from James Hansen (Proc. Nat. Acad. Sci. p. 12,753, 1998) saying that we can't predict the long term, and said he agrees.

He observed that the Pew Foundation poll for Sep. 30-Oct. 4 asked Americans if they think there is evidence of global warming being caused by humans and only 36% said yes--he said he's one of those 36%.

He concluded by observing that if you look at the difference between doing nothing at all and stabilizing emissions at 1990 levels starting in 1990, that only produces changes of a few hundredths of a degree of temperature in 2050--so no matter what we do, "we won't live long enough to see any difference."

In the Q&A session, Prof. Billie Turner said that "our academy is about to issue a statement that we are 97% sure that we will not be at a 1-degree Celsius increase but a 2-degree Celsius increase by 2050" (or about double what Balling's final slide showed). He objected that Balling's talk began with the "lunatic green fringe" and contrasted it with the IPCC, which he said would be like him beginning a talk with Dick Cheney's views before giving his own. He said this may be an effective format, but it "gives a slant on the problem that isn't real in the expert community." Turner also pointed out that on the subject of mitigation, if you are going to make a calculation in economic terms you have to use a discount rate. The Stern Review used a high discount rate, and concluded that it is worth spending a lot of money now on mitigation; William Nordhaus and Partha Dasgupta, on the other hand, used a low discount rate and concluded that it's not worth spending money on now.

Balling said that he gets email from "lefties" that ask him to "please keep criticizing" because "this [global warming] is just an excuse to keep the developing world from catching up." In conversation with a small group afterward, Balling made it clear that he thinks people shouldn't be listening to Limbaugh and Hannity on climate change, and in answer to my question about what sources the educated layman should read and rely upon, he answered unequivocally "the IPCC," at least the scientific portions authored by scientists. He had some criticisms for the way that the technical summaries are negotiated by politicians, however, and said that S. Fred Singer has made hay out of comparing the summaries to the actual science sections and pointing out contradictions. He also said that Richard Lindzen at MIT, who he said may be the best climate scientist around, thinks the whole IPCC process is flawed, and that John Christy, lead author of the 2001 IPCC report, thinks the IPCC process should allow an "alternative views" statement by qualified scientists who disagree.

In a very brief discussion afterward with the climate modeling grad student in my climate change class, he said that the biggest weakness of the talk was that Balling didn't talk about ocean temperatures, being measured by the Argo project of NASA's Jet Propulsion Laboratory. These measures had shown some recent cooling (but a long-term warming trend), but after discovering an error, Joshua Willis found that warming has continued.

Balling supports the science, but he still leans toward minimizing the negative effects, and uses some apparently bad arguments to do so. He clearly advocates a "wait and see" approach, arguing that we needn't be in a hurry to mitigate since nothing we do will have any effect in our lifetimes--but what we do now could have an enormous effect on what is required for mitigation and adaptation by future generations.

Thursday, October 29, 2009

State Press defends Ravi Zacharias

ASU's State Press columnist Catherine Smith authored an op-ed piece promoting last night's appearance of Christian apologist Ravi Zacharias. This was at least her second such op-ed; a prior one was published on September 17.

My letter to the editor, below, didn't get published, but another critic's letter did get published.

Here's mine:
Catherine Smith quotes Ravi Zacharias as stating that "irreligion and atheism have killed infinitely more than all religious wars of any kind cumulatively put together." This statement not only demonstrates Zacharias' innumeracy, it shows that he continues to make the mistake of attributing killing in the name of political ideologies like Stalinism and communism to atheism. I agree that Stalin, Mao, and Pol Pot killed more than religious wars, but it wasn't their atheism that caused that killing. Those killed by religious wars, the Inquisition, and witch trials, however, were killed in the name of religion. Out of fairness, there were no doubt political issues involved in many wars over religion as well, but if you take claims of religiously motivated killing at face value, the death tolls for those killed in the name of religion far exceed the death tolls for those killed in the name of irreligion.

Zacharias has a history of attacking atheism with misrepresentations in his books, as documented in Jeff Lowder's "An Emotional Tirade Against Atheism" and Doug Krueger's "That Colossal Wreck," both of which may be found on the Internet as part of the Secular Web (http://www.infidels.org/).
I first heard of Zacharias back around 1991, when I sat behind someone on an airplane flight who was reading his book (reviewed by Krueger, linked above), A Shattered Visage. The parts I read were truly awful, about the quality of M. Scott Huse's arguments against evolution (a step below Kent Hovind and Ken Ham). I didn't bother to attend Zacharias' ASU appearance, but would be interested in hearing any reports of how it went.

UPDATE (November 24, 2017): Steve Baughman has published an exposure of Zacharias' claims to have credentials he does not possess, and to have had academic appointments that did not exist.

UPDATE (September 26, 2020): Ravi Zacharias died of cancer earlier this year, but not before being caught in an online relationship scandal.

UPDATE (February 11, 2021): Ravi Zacharias International Ministries has publicly released a report on an investigation into abuse charges against Ravi Zacharias, and it found a significant pattern of predatory sexual abuse and a rape allegation.

The woman googled “Ravi Zacharias sex scandal” and found the blog RaviWatch, run by Steve Baughman, an atheist who had been tracking and reporting on Zacharias’s “fishy claims” since 2015. Baughman blogged on Zacharias’s false statements about academic credentials, the sexting allegations, and the subsequent lawsuit. When the woman read about what happened to Lori Anne Thompson, she recognized what had happened to that woman was what had happened to her.

As far as she could tell, this atheist blogger was the only one who cared that Zacharias had sexually abused people and gotten away with it. She reached out to Baughman and then eventually spoke to Christianity Today about Zacharias’s spas, the women who worked there, and the abuse that happened behind closed doors.

Wednesday, October 28, 2009

Teaching the Bible in public schools

The following is a letter to the editor of Arizona State University's State Press that the paper didn't print. It was written in response to an editorial by Will Munsil, son of Len Munsil, who was editor of the State Press when I was an undergraduate in the 1980s. Len Munsil is an extremely conservative Republican, a failed candidate for Governor in 2006, and founder of the Center for Arizona Policy, Arizona's version of the American Family Association. His daughter, Leigh Munsil, is the State Press's current editor-in-chief. When Munsil Sr. edited the school paper, he sometimes refused to print my letters to the editor for shaky reasons.

The letter below was written in response to Will Munsil's "Putting the Bible back in public schools," which was published on October 14.
I disagree with Will Munsil's assertion that the Bible is the foundation of American political thought. On the contrary, the American form of government was rooted in the work of enlightenment philosophers such as Locke, Montesquieu, and Rousseau. The U.S. Constitution's form of government has more resemblance to Caribbean pirate codes than to the Ten Commandments.

That said, however, I agree with Munsil that knowledge of the Bible is worthwhile and should be taught in public schools for the purpose of cultural literacy, so long as it is done without endorsing Christianity or Judaism. The Bible Literacy Project's curriculum might be one way to do it. One way not to do it is to use the National Council on Bible Curriculum in Public Schools' curriculum--it takes a sectarian perspective, is full of errors, and has failed legal challenges in Texas and Florida for being unconstitutional.
I suspect this letter wasn't excluded by reason of content, but because they had already printed a couple of letters critical of Will Munsil's op-ed by the time I submitted this on October 16. Perhaps I should have mentioned that I'm an atheist, which makes the extent of my agreement with Munsil more interesting. Of course, my view is contrary to Munsil's in that I think Bible literacy is likely to decrease, rather than increase, religious belief. But it wouldn't surprise me if the NCBCPS curriculum is the one that Will Munsil had in mind.

I should point out that I think it should probably be taught as part of a world religions class that covers more than just Christianity--kids should not only get information about the Bible that they won't get in Sunday School, they should be informed about other religions, as well as the fact that history has been full of doubters of religion, as documented in Jennifer Michael Hecht's excellent Doubt: A History.

You can find out more about the NCBCPS curriculum that failed legal challenge in Texas here.

Munsil cited Stephen Prothero, whose op-ed piece, "We live in a land of biblical idiots," I wrote about at the Secular Web in early 2007.