Sunday, November 08, 2009

Philosophy Bites podcast

I've been listening to past episodes of the Philosophy Bites podcast, and I highly recommend it--they are short (about 15-minute) discussions with prominent philosophers about specific philosophical topics and questions. I've found them to be consistently of high quality and interesting, even in the one case where I think the philosophical argument was complete nonsense (Robert Rowland Smith on Derrida on forgiveness). Even there, the interviewers asked the right questions.

I particularly have enjoyed listening to topics that are outside the areas of philosophy I've studied, like Alain de Botton on the aesthetics of architecture. Other particularly good ones have been Hugh Mellor on time, David Papineau on physicalism, A.C. Grayling on Descartes' Meditations, and Peter Millican on the significance of Hume. I've still got a bunch more past episodes to listen to; I'm going to be somewhat disappointed when I catch up.

Saturday, November 07, 2009

Robert B. Laughlin on "The Crime of Reason"

The 2009 Hogan and Hartson Jurimetrics Lecture in honor of Lee Loevinger was given on the afternoon of November 5 at Arizona State University's Sandra Day O'Connor School of Law by Robert B. Laughlin. Laughlin, the Ann T. and Robert M. Bass Professor of Physics at Stanford University and winner of the 1998 Nobel Prize in Physics (along with Horst L. Stormer and Daniel C. Tsui), spoke about his recent book, The Crime of Reason.

He began with a one-sentence summary of his talk: "A consequence of entering the information age is probably that we're going to lose a human right that we all thought we had but never did ..." The sentence went on but I couldn't keep up with him in my notes to get it verbatim, and I am not sure I could identify precisely what his thesis was after hearing the entire talk and Q&A session. The main gist, though, was that he thinks that a consequence of allowing manufacturing to go away and being a society based on information is that "Knowledge is dear, therefore there has to be less of it--we must prevent others from knowing what we know, or you can't make a living from it." And, he said, "People who learn on their own are terrorists and thieves," which I think was intentional hyperbole. I think his talk was loaded with overgeneralizations, some of which he retracted or qualified during the Q&A.

It certainly doesn't follow from knowledge being valuable that there must be less of it. Unlike currency, knowledge isn't a fungible commodity, so different bits of knowledge have different value to different people. There are also different kinds of knowledge--know-how vs. knowledge-that--and making the latter freely available doesn't necessarily degrade the value of the former, which is why it's possible to have a business model that gives away software for free but makes money from consulting services. Further, the more knowledge there is, the more valuable it is to know where to find the particular bits of knowledge that are useful for a given purpose, and the less it is possible for a single person to be an expert across many domains. An increasing amount of knowledge means there's increasing value in various kinds of specializations, and more opportunities for individuals to develop forms of expertise in niches that aren't already full of experts.

Laughlin said that he is talking about "the human rights issue of the 21st century," that "learning some things on your own is stealing from people. What we think of as our rights are in conflict with the law, just as slavery is in conflict with human rights." He said that Jefferson was conflicted on this very issue, saying on the one hand that "knowledge is like fire--divinely designed to be copyable like a lit taper--I can light yours with mine, which in no way diminishes my own." This is the non-rival quality of information: one person copying information from another doesn't deprive the other of its use, though it certainly may have an impact on the commercial market for the first person to sell their information.

"On the other hand," said Laughlin, "economics involves gambling. [Jefferson] favored legalized gambling. Making a living involves bluff and not sharing knowledge." He said that our intellectual property laws derive from English laws that people on the continent "thought ... were outrageous--charging people to know things."

He put up a photo of a fortune from a fortune cookie, that said "The only good is knowledge, and the only evil ignorance." He said this is what you might tell kids in school to get them to study, but there's something not right about it. He then put up a drawing of Dr. Frankenstein and his monster (Laughlin drew most of the slides himself). He said, we're all familiar with the Frankenstein myth. "The problem with open knowledge is that some of it is dangerous. In the U.S. some of it is off-limits, you can't use it in business or even talk about it. It's not what you do with it that's exclusive, but that you have it at all."

His example was atomic bomb secrets and the Atomic Energy Act of 1954, which makes it a federal felony to reveal "nuclear data" to the public, which has been defined very broadly in the courts. It includes numbers and principles of physics.

Laughlin returned to his fortune cookie example, and said there's another problem. He put up a drawing of a poker game. "If I peeked at one guy's cards and told everyone else, the poker game would stop. It involves bluffing, and open access to knowledge stops the game." He suggested that this is what happened last year with the world financial sector--that the "poker game in Wall Street stopped, everyone got afraid to bet, and the government handled it by giving out more chips and saying keep playing, which succeeded." While I agree that this was a case where knowledge--specifically knowledge of the growing amounts of "toxic waste" in major world banks--caused things to freeze up, it wasn't the knowledge that was the ultimate cause; it was the fact that banks engaged in incredibly risky behavior that they shouldn't have. More knowledge earlier--and better oversight and regulation--could have prevented the problem.

Laughlin said "Economics is about bluff and secrecy, and open knowledge breaks it." I don't think I agree--what makes markets function is that price serves as a public signal about knowledge. There's always going to be local knowledge that isn't shared, not necessarily because of bluff and secrecy, but simply due to the limits of human capacities and the dynamics of social transactions. While trading on private knowledge can result in huge profits, trading the private knowledge itself can be classified as insider trading and is illegal. (Though perhaps it shouldn't be, since insider trading has the potential for making price signals more accurate more quickly to the public.)

Laughlin showed a painting of the death of Socrates (by Jacques-Louis David, not Laughlin this time), and said that in high school, you study Plato, Aristotle, and Descartes, and learn that knowledge is good. But, "as you get older, you learn there's a class system in knowledge." Plato etc. is classified as good, but working class technical knowledge, like how to build a motor, is not, he claimed. He went on to say, "If you think about it, that's exactly backwards." I'm not sure anyone is ever taught that technical knowledge is not valuable, especially these days, where computer skills seem to be nearly ubiquitous--and I disagree with both extremes. From my personal experience, I think some of my abstract thinking skills that I learned from studying philosophy have been among the most valuable skills I've used in both industry and academia, relevant to both theoretical and practical applications.

Laughlin said that "engines are complicated, and those who would teach you about it don't want to be clear about it. It's sequestered by those who own it, because it's valuable. The stuff we give away in schools isn't valuable, that's why we give it away." In the Q&A, a questioner observed that he can easily obtain all sorts of detailed information about how engines work, and that what makes it difficult to understand is the quantity and detail. Laughlin responded that sometimes the best way to hide things is to put them in plain sight (the Poe "purloined letter" point), as needles in a haystack. But I think that's a rather pat answer to something that is contradictory to his claim--the information really is freely available and easy to find, but the limiting factor is that it takes time to learn the relevant parts to have a full understanding. The limit isn't the availability of the knowledge or that some of it is somehow hidden. I'd also challenge his claim that the knowledge provided in schools is "given away." It's still being paid for, even if it's free to the student, and much of what's being paid for is the know-how of the educator, not just the knowledge-that of the specific facts, as well as special kinds of knowledge-that--the broader frameworks into which individual facts fit.

Laughlin went on to say, "You're going to have to pay to know the valuable information. Technical knowledge will disappear and become unavailable. The stuff you need to make a living is going away." He gave as examples defense-related technologies, computers, and genetics. He said that "people in the university sector are facing more and more intense moral criticism" for sharing information. "How life works--would we want that information to get out? We might want to burn those books. The 20th century was the age of physics, [some of which was] so dangerous we burned the books. It's not in the public domain. The 21st century is the age of biology. We're in the end game of the same thing. In genetics--e.g., how disease organisms work. The genetic structure of Ebola or polio." Here, Laughlin seems to be just wrong. The gene sequences of Ebola and polio have apparently been published (Sanchez, A., et al. (1993) "Sequence analysis of the Ebola virus genome: organization, genetic elements and comparison with the genome of Marburg virus," Virus Research 29, 215-240 and Stanway, G., et al. (1983) "The nucleotide sequence of poliovirus type 3 leon 12 a1b: comparison with poliovirus type 1," Nucleic Acids Res. 11(16), 5629-5643). (I don't claim to be knowledgeable about viruses, in the former case I am relying on the statement that "Sanchez et al (1993) has published the sequence of the complete genome of Ebola virus" from John Crowley and Ted Crusberg, "Ebola and Marburg Virus: Genomic Structure, Comparative and Molecular Biology."; in the latter case it may not be publication of the complete genome but is at least part.)

Laughlin talked about the famous issue of The Progressive magazine which featured an article by Howard Morland titled "How H-Bombs Work." He showed the cover of the magazine, which read, "The H-Bomb Secret--How we got it--why we're telling it." Laughlin said that the DoJ obtained an injunction preventing the magazine from publishing the article and took the issue into secret hearings. The argument was that publication was a threat to national security and a violation of the Atomic Energy Act. The judge said that the rule against prior restraint didn't apply because the material was so dangerous that "no jurist in their right mind would put free speech above safety." Laughlin said, "Most people think the Bill of Rights protects you, but this case shows that it doesn't." After the judge forbade publication, the article was leaked to a couple of "newspapers on the west coast," after which the DoJ dropped the case and the article was published. According to Laughlin, this was strategy: he suspects the DoJ didn't prosecute because the outcome would have been a finding that the AEA was unconstitutional. By dropping the case, it kept the AEA as a potential weapon for future cases. He said there have only been two prosecutions under the criminal provisions of the AEA in the last 50 years, but it is "inconceivable that it was only violated twice. The country handles its unconstitutionality by not prosecuting." The U.S., he said, is like a weird hybrid of Athens and Sparta, favoring both being open and being war-like and secretive. These two positions have never been reconciled, so we live in an unstable situation that favors both.

He also discussed the case of Wen Ho Lee, a scientist from Taiwan who worked at Los Alamos National Laboratory, who took home items that were classified as "PARD" (protect as restricted data), even though everyone is trained repeatedly that you "Don't take PARD home." When he was caught, Laughlin said, he said "I didn't know it was wrong" and "I thought they were going to fire me, so I took something home to sell." The latter sounds like an admission of guilt. He was put into solitary confinement for a year (actually 9 months) and then the case of 50 counts of AEA violations was dropped. Laughlin characterized this as "extralegal punishment," and said "we abolish due process with respect to nuclear data." (Wen Ho Lee won a $1.5 million settlement from the U.S. government in 2006 before the Supreme Court could hear his case. Somehow, this doesn't seem to me to be a very effective deterrent.)

Laughlin said that we see a tradeoff between risk and benefit, not an absolute danger. The risk of buildings being blown up is low enough to allow diesel fuel and fertilizer to be legal. Bombs from ammonium nitrate and diesel fuel are very easy to make, and our protection isn't hiding technical knowledge, but that people just don't do it. But nuclear weapons are so much more dangerous that the technical details are counted as absolutely dangerous, no amount of benefit could possibly be enough. He said that he's writing a book about energy and "the possible nuclear renaissance unfolding" (as a result of need for non-carbon-emitting energy sources). He says the U.S. and Germany are both struggling with this legal morass around nuclear information. (Is the unavailability of nuclear knowledge really the main or even a significant issue about nuclear plant construction in the United States? General Electric (GE Energy) builds nuclear plants in other countries.)

Laughlin said that long pointy knives could be dangerous, and there's a movement in England to ban them. Everybody deals with technical issue of knowledge and where to draw lines. (Is it really feasible to ban knives, and does such a ban constitute a ban on knowledge? How hard is it to make a knife?)

At this point he moved on to biology, and showed a photograph of a fruit fly with legs for antennae. He said, "so maybe antennae are related to legs, and a switch in development determines which you get. The control machinery is way too complicated to understand right now." (Really?) "What if this was done with a dog, with legs instead of ears. Would the person who did that go to Stockholm? No, they'd probably lose their lab and be vilified. In the life sciences there are boundaries like we see in nuclear--things we shouldn't know." (I doubt that there is a switch that turns dog ears into legs, and this doesn't strike me as plausibly being described as a boundary on knowledge, but rather an ethical boundary on action.) He said, "There are so many things researchers would like to try, but can't, because funders are afraid." Again, I suspect that most of these cases are ethical boundaries about actions rather than knowledge, though of course there are cases where unethical actions might be required to gain certain sorts of knowledge.

He turned to stem cells. He said that the federal government effectively put a 10-year moratorium on stem cell research for ethical reasons. Again, these were putatively ethical reasons regarding treatment of embryos, but the ban was on federally funded research rather than any research at all. It certainly stifled research, but didn't eliminate it.

Next he discussed the "Millennium Digital Copyright Act" (sic). He said that "people who know computers laugh at the absurdity" of claiming that computer programs aren't formulas and are patentable. He said that if he writes a program that "has functionality or purpose similar to someone else's my writing it is a violation of the law." Perhaps in a very narrow case where there's patent protection, yes, but certainly not in general. If he was arguing that computer software patents are a bad idea, I'd agree. He said "Imagine if I reverse-engineered the latest Windows and then published the source code. It would be a violation of law." Yes, in that particular example, but there are lots of cases of legitimate reverse engineering, especially in the information security field. The people who come up with the signatures for anti-virus and intrusion detection and prevention do this routinely, and in some cases have actually released their own patches to Microsoft vulnerabilities because Microsoft was taking too long to do it themselves.

He said of the Microsoft Word and PDF formats that they "are constantly morphing" because "if you can understand it you can steal it." But there are legal open source and competing proprietary software solutions that understand both of the formats in question--OpenOffice, Apple's Pages and Preview, Foxit Reader, etc. Laughlin said, "Intentional bypassing of encryption is a violation of the DMCA." Only if the bypassing circumvents "a technological measure that effectively controls access to" copyrighted material, and is not done for purposes of security research, which has a big exception carved out in the law. Arguably, breakable encryption doesn't "effectively control access," though the law has certainly been used to prosecute people who broke really poor excuses for encryption.

Laughlin put up a slide of the iconic smiley face, and said it has been patented by Unisys. "If you use it a lot, you'll be sued by Unisys." I'm not sure how you could patent an image, and while there are smiley face trademarks that have been used as a revenue source, it's by a company called SmileyWorld, not Unisys.

He returned to biology again, to talk briefly about gene patenting, which he says "galls biologists" but has been upheld by the courts. (Though perhaps not for many years longer, depending on how the Myriad Genetics case turns out.) Natural laws and discoveries aren't supposed to be patentable, so it's an implication of these court decisions that genes "aren't natural laws, but something else." The argument is that isolating them makes them into something different than what they are when they're part of an organism, which somehow constitutes an invention. I think that's a bad argument that could only justify patenting the isolation process, not the sequence.

Laughlin showed a slide of two photos, the cloned dog Snuppy and its mother on the left, and a Microsoft Word Professional box on the right. He said that Snuppy was cloned when he was in Korea, and that most Americans are "unhappy about puppy clones" because they fear the possibility of human clones. I thought he was going to say that he had purchased the Microsoft Word Professional box pictured in Korea at the same time, and that it was counterfeit, copied software (which was prevalent in Korea in past decades, if not still), but he had an entirely different point to make. He said, about the software, "The thing that's illegal is not cloning it. If I give you an altered version, I've tampered with something I'm not supposed to. There's a dichotomy between digital knowledge in living things and what you make, and they're different [in how we treat them?]. But they're manifestly not different. Our legal system['s rules] about protecting these things are therefore confused and mixed up." I think his argument and distinction was rather confused, and he didn't go on to use it in anything he said subsequently. It seems to me that the rules are pretty much on a par between the two cases--copying Microsoft Word Professional and giving it to other people would itself be copyright infringement; transforming it might or might not be a crime depending on what you did. If you turned it into a piece of malware and distributed that, it could be a crime. But if you sufficiently transformed it into something useful that was no longer recognizable as Microsoft Word Professional, that might well be fair use of the copyrighted software. In any case in between, I suspect the only legally actionable offense would be copyright infringement, in which case the wrongdoing is the copying, not the tampering.

He put up a slide of Lady Justice dressed in a clown suit, and said that "When you talk to young people about legal constraints on what they can do, they get angry, like you're getting angry at this image of Lady Law in a clown suit. She's not a law but an image, a logos. ... [It's the] root of our way of relating to each other. When you say logos is a clown, you've besmirched something very fundamental about who you want to be. ... Legal constraints on knowledge is part of the price we've paid for not making things anymore." (Not sure what to say about this.)

He returned to his earlier allusion to slavery. He said that was "a conflict between Judeo-Christian ethics and what you had to do to make a living. It got shakier and shakier until violence erupted. War was the only solution. I don't think that will happen in this case. [The] bigger picture is the same kind of tension. ... Once you make Descartes a joke, then you ask, why stay?" He put up a slide of a drawing of an astronaut on the moon, with the earth in the distance. "Why not go to the moon? What would drive a person off this planet? You'd have to be a lunatic to leave." (I thought he was going to make a moon-luna joke, but he didn't, unless that was it.) "Maybe intellectual freedom might be that thing. It's happened before, when people came to America." He went on to say that some brought their own religious baggage with them to America. Finally, he said that when he presents that moon example to graduate students, he always has many who say "Send me, I want to go."

And that's how his talk ended. I was rather disappointed--it seemed rather disjointed and rambling, and made lots of tendentious claims--it wasn't at all what I expected from a Nobel prizewinner.

The first question in the Q&A was one very much like I would have asked, about how he explains the free and open source software movement. Laughlin's answer was that he was personally a Linux user and has been since 1997, but that students starting software companies are "paranoid about having stuff stolen," and "free things, even in software, are potentially pernicious," and that he pays a price for using open source in that it takes more work to maintain it and he's constantly having to upgrade to deal with things like format changes in PDF and Word. There is certainly such a tradeoff for some open source software, but some of it is just as easy to maintain as commercial software, and there are distributions of Linux that are coming closer to the ease of use of Windows. And of course Mac OS X, based on an open source, FreeBSD-derived operating system, is probably easier for most people to use than Windows.

I think there was a lot of potentially interesting and provocative material in his talk, but it just wasn't formulated into a coherent and persuasive argument. If anyone has read his book, is it more tightly argued?

Roger Pielke Jr. on climate change adaptation

A few hours after hearing Roger Pielke Jr. speak on climate change mitigation to CSPO, I heard him speak about climate change adaptation to a joint meeting of my seminar on human dimensions of climate change and another seminar with Dan Sarewitz, CSPO's director.

Like his previous talk, Pielke began this one with a slide on his positions, which was something like this:
  • Strong advocate of mitigation and adaptation.
  • He accepts the science of the IPCC.
  • There are other reasons behind impacts of climate--effects of inexorable development and growth.
  • The importance and challenge of climate change does not justify misrepresenting the science of adaptation--yet this happens on a regular basis (I’ll give a few examples).
  • We might choose to mitigate, but we will adapt.
He said (as he did in the earlier talk) that he has no disagreements with the science of IPCC working group I, lots of agreements with the economics and mitigation arguments of working group III (covered in the earlier talk), and some disagreements with the impacts, adaptation, and vulnerability arguments of working group II, which will be covered in this talk.

He then gave a slide of the outline of this talk:
  • The concept of adaptation is contested.
  • How we think about adaptation shapes how we think about research and policy.
  • Under the FCCC (Kyoto Protocol), adaptation is defined narrowly--as adaptation to climate change caused by the emissions of greenhouse gases.
  • The narrow definition creates a bias against adaptation.
  • Regardless, the primary factors underlying climate impacts on society are the result of development and growing wealth and vulnerability.
There are different definitions of "climate change" used between the IPCC and the UN FCCC. The IPCC defines it as "...change arising from any source," while the FCCC defines it as "...a change of climate which is attributed directly or indirectly to human activity that alters the composition of the global atmosphere and which is in addition to natural climate variability observed over comparable time periods."

On the former definition, if the sun causes the earth to warm, which causes climate change effects, that's a climate change. On the latter, it's not. The latter definition restricts climate change to impacts caused by human-caused changes to greenhouse gases in the atmosphere.

Adaptation under the logic of the FCCC is that any increase of atmospheric carbon above 450 ppm (corresponding to a 2-degree Celsius temperature increase) is "dangerous" climate change that requires human adaptation. If we happen to stabilize at 449 ppm, then no adaptation at all is required. Under this definition, the more adaptation we need, the more we have failed in climate policy.

Under the IPCC's cost-benefit analysis, adaptation is considered a cost with no benefits.

Al Gore's Earth in the Balance calls adaptation "a kind of laziness."

Tim Flannery, author of The Weather Makers, says that adaptation is "genocide."

IPCC's working group I uses the IPCC definition of climate change; working group III uses the FCCC definition; working group II shifts back and forth between the two.

But climate impacts are caused by a combination of effects: vulnerability (with sociological and ecological components) and by climate change and variability (which includes natural internal and natural external components, human greenhouse gas changes, and non-human greenhouse gas changes). In order to deal with those impacts, you can back up the causal chain to each of those causes, from the IPCC perspective. But from the FCCC perspective, it's as though none of those other factors are available except for the human contribution to greenhouse gases.

Why did the FCCC use this definition? Because the UN already has other frameworks for disaster preparedness, water management, desertification prevention, and biodiversity protection, and they didn't want any overlap of responsibility.

Choice of definition of climate change can thus create a bias against adaptation, and puts science in impossible situations (requiring conclusive attribution of cause on human greenhouse gases). In reality, adaptation has broad benefits, such as contributing to sustainable development.

The Global Environment Facility of the UN, which releases funds for adaptation, will only pay out in proportion to effects caused by human greenhouse gases. Because of this requirement for attribution of cause, very little has been paid out. Oxfam said that the UNFCCC's global spending from the GEF is equal to what the UK spends annually on flood defense. If a developing nation has a disaster attributable to climate change and asks for funds, it is required to provide evidence for the percentage of damage attributable to climate change caused by human-produced greenhouse gases. One effect of this is that governmental spokespersons are likely to make such attributions in the media.

Swiss Re did a report on adaptation in the broad sense, without regard to attribution of cause, and added up losses and deaths from natural disasters to get totals of $50 trillion and 850,000 lives over 50 years; CNN reported this as meaning that human greenhouse gases caused all of that damage and death.

Another problem with the narrow definition is illustrated by malaria scenarios. Jeffrey Sachs (2003) projects that without malaria, African GDP growth might be 3%/year higher. If you plug that into the Kaya Identity, African emissions would be about 17 GtC by 2050, vs. less than 1 GtC today. With malaria unmitigated, emissions will not even hit 6 GtC by 2050. The IPCC's projections thus presuppose that malaria will go unmitigated, which seems to be NOT how we should be thinking about climate policy.
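The arithmetic behind that example is the Kaya Identity, which decomposes emissions as population x per-capita GDP x energy intensity of GDP x carbon intensity of energy. Here's a minimal sketch; the function names and every numeric value are my own illustrative assumptions, not Sachs's or the IPCC's figures--the point is just how compounding a higher GDP growth rate to 2050 multiplies projected emissions severalfold.

```python
# Sketch of the Kaya Identity: C = P * (GDP/P) * (E/GDP) * (C/E).
# All numbers below are illustrative assumptions, for this example only.

def kaya_emissions(population_billions, gdp_per_capita,
                   energy_per_gdp, carbon_per_energy):
    """Annual carbon emissions (GtC) from the four Kaya factors."""
    return population_billions * gdp_per_capita * energy_per_gdp * carbon_per_energy

def compound(value, annual_growth, years):
    """Compound a quantity growing at a fixed annual rate."""
    return value * (1 + annual_growth) ** years

# Hypothetical baseline: 1 billion people at $2,000 per-capita GDP; the two
# intensities are chosen so their product (0.00025) is ~0.25 kgC per dollar,
# giving baseline emissions of about 0.5 GtC/year.
pop, gdp0 = 1.0, 2000.0
energy_per_gdp, carbon_per_energy = 0.01, 0.025

# Compare two GDP growth paths compounded over 45 years (to ~2050):
low = kaya_emissions(pop, compound(gdp0, 0.03, 45), energy_per_gdp, carbon_per_energy)
high = kaya_emissions(pop, compound(gdp0, 0.06, 45), energy_per_gdp, carbon_per_energy)
print(round(low, 2), round(high, 2))  # higher growth -> several times the emissions
```

With these toy numbers the higher-growth path yields more than three times the emissions of the lower one, which is the shape of Pielke's point: eliminating malaria raises growth, and growth dominates the emissions projection.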

Pielke argued that the broader notion of climate change and broader notion of adaptation are more useful. Adaptation is not in opposition to mitigation, and it has benefits as well as costs. In reality, we don't care just about greenhouse gases, we care about the impacts regardless of cause. By drawing a circle around human contributions to greenhouse gases and setting goals that focus only on that, we've engaged in "goal substitution," where addressing a single cause has become our goal instead of addressing the effects.

He then put up a slide of various book and magazine covers, as well as the poster for "An Inconvenient Truth," and said that "hazards are a centerpiece of the climate debate." One of the magazine covers, an issue of Newsweek from January 1996 with a cover story labeled "The Hot Zone: Blizzards, Floods, and Hurricanes, Blame Global Warming," was what got Pielke interested in doing research. The period 1991-1994, before that story, was a very quiet period for hurricanes hitting the U.S., but also the most expensive in terms of damage. Although he didn't study blizzards, he did study floods and hurricanes, and said he found that "the biggest signal in disasters wasn't climate."

Pielke then wanted to explain how his research has been used by the IPCC and the Bush and Obama administrations, looking at two reports: Climate Change 2007: Impacts, Adaptation, and Vulnerability from the IPCC (the report of working group II), and the U.S. Climate Change Science Program (CCSP) Report, Weather and Climate Extremes in a Changing Climate, Regions of Focus: North America, Hawaii, Caribbean, and U.S. Pacific Islands.

He gave this quote from the IPCC report:
1.3.8.5 Summary of disasters and hazards
Global losses reveal rapidly rising costs due to extreme weather-related events since the 1970s. One study has found that while the dominant signal remains that of the significant increases in the value of exposure at risk, once losses are normalised for exposure, there still remains an underlying rising trend.
He pointed out that the reference to "one study" is interesting, because he has published dozens of studies in this area, none of which show such a trend. The study in question mentioned here is "Muir Wood, et al., 2006," which is by R. Muir Wood, S. Miller, and A. Boissonade, titled "The search for trends ..." which is one of 24 papers commissioned as background by Peter Hoppe and Pielke for a workshop they conducted with experts from multiple countries, Munich Re, the Tyndall Centre, NSF, etc. The plan for that workshop was to be a "dissensus consensus," to identify areas of disagreement for further study, but they ended up reaching consensus on 20 statements.

The motivation for the workshop was a graph from Munich Re that showed that the cost of disasters, adjusted for inflation, has been increasing. The workshop wanted to find out what was causing this to happen and whether any percentage of it could be attributed to climate change.
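The "normalised for exposure" adjustment at issue here can be sketched simply. This is only a toy version of the kind of normalization used in the disaster-loss literature (which adjusts historical losses for inflation, population, and per-capita wealth), and every number below is hypothetical.

```python
# Simplified sketch of disaster-loss normalization: restate a historical
# loss in present-day terms by scaling for inflation, population growth,
# and per-capita wealth growth in the affected area. All factors below
# are hypothetical, chosen for illustration only.

def normalize_loss(nominal_loss, inflation_factor,
                   population_factor, wealth_factor):
    """Historical nominal loss scaled to present-day exposure."""
    return nominal_loss * inflation_factor * population_factor * wealth_factor

# A hypothetical $1B loss in 1970, in a coastal county where prices have
# since tripled, population doubled, and per-capita wealth grew 1.5x:
print(normalize_loss(1.0, 3.0, 2.0, 1.5))  # -> 9.0 (billions, today's terms)
```

If losses normalized this way show no residual upward trend, the raw increase in damage is explained by growing exposure rather than by changing storms--which is the crux of the disagreement over the "underlying rising trend" claim.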

The types of disasters in question were:
  • Earthquake, tsunami, and volcano, which couldn't be attributed to climate change.
  • Windstorms and floods, which could possibly be attributed to climate change and have been responsible for most of the increasing damage.
  • Disasters of temperature extremes such as heatwaves, drought, and wildfires, which could also be attributed to climate change, but which aren't responsible for most of the increasing damage.
Three of the consensus statements agreed to by all participants, including Muir Wood, were:
  1. Analyses of long-term records of disaster losses indicate that societal change and economic development are the principal factors responsible for the documented increasing losses to date.
  2. Because of issues related to data quality, the stochastic nature of extreme event impacts, length of time series, and various societal factors present in the disaster loss record, it is still not possible to determine the portion of the increase in damages that might be attributed to climate change due to GHG emissions.
  3. In the near future the quantitative link (attribution) of trends in storm and flood losses to climate changes related to GHG emissions is unlikely to be answered unequivocally.
The first statement is accurately reflected in the IPCC statement, but the second is exactly the opposite of what it says.

The Muir Wood paper itself says that if you look at the period 1970-2005, you have an upward trend that can't be attributed to just societal factors. But 2005 was the year of Hurricane Katrina, and 1970-1974 was a period when the Atlantic was very quiet. If you look at 1950-2005, there is no trend, Pielke said. The IPCC not only took a single background paper from the workshop, they actually took a subset of the paper's data to draw their conclusion.

Pielke argued that the damage trends can't be due to storm intensity alone, based on a graph of major category 3, 4, and 5 hurricanes vs. year. The 177 U.S. coastal counties have seen huge population growth--for example, the population of Harris County, Texas in 2005 was equal to the entire U.S. coastal population from Florida to South Carolina in 1955.

He showed comparison photos of Miami Beach in 1926 vs. 2006, and then a graph of the estimated amount of U.S. damage per year if every hurricane season had occurred with 2005 population levels. That graph shows a huge spike in 1926, when a big hurricane hit Miami; it would have been 1.5 to 2 times the damage of Katrina. 2004 and 2005 were also years of very high damage, though not as high as 1926.

The bottom line, Pielke said, is no statistical change in damage since 1900, which is consistent with the physical characteristics of hurricanes at landfall over that same period. Other signals do show up in the data, such as El Nino: when the Pacific is cold you get more hurricanes; when it's warm, you get fewer. 1927-1969 was a very active period for hurricanes; the 1970s and '80s were not. He said there have been two independent replications of the same results with different data sets and methodologies, and that insurance and reinsurance companies use this work in their pricing models.

His summary slide said this:
  • Raw damages are increasing.
  • Normalized damages show no trend, consistent with the lack of trend in landfall.
  • Increases in inflation, wealth, and development along the coastline account for increasing damages.
  • While coastal development in hurricane-prone regions is increasing, in aggregate it appears to be proportional to the rest of the United States, with large local variations.
It occurred to me that one factor that might counteract a genuine increase in storm intensity with respect to damage would be better construction, but I didn't raise the issue since I figured it would have been unlikely for such a factor to exactly offset storm intensity increases so that there was no trend. Afterward, though, I found this paper by Judy Curry (PDF) which argues that improvements in building codes have just such an effect, and that the pre-1930 data Pielke uses was a time of inflated property values before the Great Depression, and if you take it out you get an upward trend again.

In response to a student question about whether probabilities of landfall have changed, Pielke said that the overall odds of hurricane landfall are unchanged within the data set (though there are subsets where it is different) and that studies of the west coast of Mexico, South Korea, China, Japan, Southeast Asia, Africa, and Madagascar show no regions where hurricane landfalls have increased.

He reported three other studies that have shown no upward trend in normalized weather losses--a study of his own with the head of the Cuba Weather Service, covering 1900-1998 (Pielke et al., 2003), one for Australia covering 1965-2005 (Crompton & McAneney, 2007), and one for India for a time period I didn't catch (Raghavan & Sen Sarma, 2003). He said there are about 15 other studies of the same sort, and that Laurens Bouwer of the Vrije Universiteit Amsterdam has a review paper of all of these studies that is under review for publication.

When you look at U.S. flood losses, after adjusting for societal factors, there has been a slight (not statistically significant) downward trend in losses.

Pielke then said that he took a number of weather loss data sets, standardized them, took ten-year averages and overall averages, and put them all on top of each other. These data sets included Munich Re's global losses for 1979-xx (I didn't catch the end year), U.S. flood losses, and Australian weather losses. While Munich Re's global losses correlate strongly with U.S. hurricane losses (a correlation of 0.80, meaning 64% of the variance in global losses is explained by U.S. losses), Pielke said, "there's no secular trend over the time period for which we have these data sets."

Regarding hurricanes, however, Pielke said his data is consistent with hurricanes becoming more intense. He referred to Kerry Emanuel's 4 August 2005 paper in Nature, titled "Increasing destructiveness of tropical cyclones over the past 30 years," which was featured in "An Inconvenient Truth." He showed a graph from the paper which shows windspeed cubed, or power dissipation index (PDI), has increased. Pielke noted that this is not a measure of "destructiveness," and the paper says nothing about destruction caused by hurricanes. He broke the Atlantic basin into five equal compartments with an equal number of observations of hurricane intensity (windspeed measurement) from the 1880s to the present, for all named storms, 39 knots and higher. He found that the strongest upward trends are farthest out to sea, and no trends in the locations where damage actually occurs.
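Emanuel's PDI is essentially windspeed cubed accumulated over a storm's observations, which is why it weights the most intense storms so heavily. A minimal sketch of the idea, with made-up windspeeds (the real index integrates the cube of maximum sustained windspeed over a storm's lifetime; physical constants are omitted here):

```python
def power_dissipation_index(windspeeds_knots):
    """Crude PDI: sum of cubed maximum windspeeds over a storm's
    six-hourly observations (physical constants omitted)."""
    return sum(v ** 3 for v in windspeeds_knots)

# Cubing means the intense observations dominate the index:
weak = power_dissipation_index([40, 50, 45])     # a minimal storm
strong = power_dissipation_index([40, 90, 120])  # a major hurricane
print(strong / weak)  # the strong storm's PDI is ~9x the weak one's
```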

He said he did the same with Emanuel's graph and got the same result, that all of the trends are out to sea. So, he argued, Emanuel's results could be due to real changes in storm intensity as a result of ocean temperature changes, or they could be due to increased storm counts due to more and better data collection out at sea. He submitted a letter to Geophysical Research Letters reporting his result, which was rejected with negative reviews that said "everybody already knows this." But, Pielke said, Emanuel didn't know it until he pointed it out to him.

Next, Pielke talked about the U.S. CCSP Report, which spanned the Bush and Obama administrations. This report said the following about U.S. extreme weather events:
  1. Over the long-term U.S. hurricane landfalls have been declining.
  2. Nationwide there have been no long-term increases in drought.
  3. Despite increases in some measures of precipitation, there have not been corresponding increases in peak streamflows (>90th percentile).
  4. There have been no observed changes in the occurrence of tornadoes or thunderstorms.
  5. There have been no long-term increases in strong East Coast winter storms, called Nor’easters.
  6. There are no long-term ...
With these conclusions, he said, you'd expect no claims of increasing losses from damage. But the report says:
Extremes are already having significant impacts on North America. ... both the climate and the socioeconomic vulnerability to weather and climate extremes are changing (Brooks and Doswell, 2001; Pielke et al., 2008; Downton et al., 2005).
Two of the three papers cited have Pielke as author or co-author, and the third applies his sort of methodology to tornadoes. The Harold E. Brooks and Charles A. Doswell III 2001 paper says: "We find nothing to suggest that damage from individual tornadoes has increased through time, except as a result of the increasing cost of goods and accumulation of wealth in the United States." The Pielke et al. 2008 paper finds no trends in absolute data or under a logarithmic transformation. The Downton, Miller, and Pielke 2005 paper talks about the National Weather Service flood loss database, says absolutely nothing about climate change, and shows a drop in losses. So of the three cited papers for the claim, two say the opposite of the claim and one is silent. Pielke says there is no published study that supports the claim. When he made a stink about this, he said he ended up being called a climate change denier. The IPCC and CCSP are supposed to be places we go to get reliable information, he said, and "I'm much more willing to listen to others who say their work was misrepresented since I know mine was."

In 2000, he co-authored an article with Dan Sarewitz on "Breaking the Global Warming Gridlock" in The Atlantic Monthly that argued for getting people engaged with adaptation rather than focusing exclusively on mitigation. After that came out, he says he was told privately by a representative of an environmental group that "we agree with what you say, but it's not helpful now because we're trying to win a [political] battle on mitigation."

He pointed out two recent cases of people in government being silenced for speaking out contrary to policy--David Nutt, the UK government's chief drug policy advisor, who was fired for saying that ecstasy use carries comparable risk to riding horses (and that ecstasy is safer to give to a stranger than peanuts), and Clive Spash, an economist at Australia's CSIRO, whose journal paper critical of cap and trade was accepted for publication but withdrawn after his supervisor wrote to the journal and asked for it to be retracted.

He asked, "If the public loses faith in the connection between authoritative scientific statements and policy, then what do we rely upon to make decisions?"

He suggested that we need to improve processes where there is potential for intellectual conflicts-of-interest, such as where people with a stake in an assessment highlight their own research over other research they don't favor. He thinks this doesn't seem to be a problem with IPCC working group I, but has been a problem with both working groups II and III and with the CCSP. In both of the cases he referred to regarding his own work, above, he said a single person was responsible (not the same person in both cases, but one in each).

I left about ten minutes before the end of the class and so missed any further wrapup, as I had to get to the opposite side of campus for another talk, by Robert B. Laughlin, one of the winners of the 1998 Nobel prize in physics.

UPDATE (24 September 2013): Michael Mann, on Twitter, called Pielke Jr.'s work on storm damage "deeply flawed work" because its "normalization procedure removes climate change signal," and pointed to this critique by Grinsted.

UPDATE (13 July 2014): An updated version of the information in this talk is found in Ch. 7 of Pielke Jr.'s book, The Climate Fix (2010, Basic Books).

Friday, November 06, 2009

Roger Pielke Jr. on climate change mitigation

Yesterday I heard Roger Pielke Jr. speak twice at Arizona State University, first in a talk to the Consortium for Science, Policy and Outcomes (CSPO) on climate change mitigation, and second in a class lecture on climate change adaptation. This post is about the former.

His talk was entitled "The Simple Math of Emissions Reduction," and began with a quote from Steve Rayner of Oxford University:
Wicked Problems
have Clumsy Solutions
requiring Uncomfortable Knowledge
which he then followed up with a slide on "Where I stand," which included the following bullet points (nearly, but probably not exactly verbatim):
  • Strong advocate for mitigation and adaptation policies
  • Continuing increase in atmospheric CO2 could pose large risks, as described by IPCC
  • Stabilizing concentrations at low levels can’t succeed if we underestimate the challenge (and we have)
  • Mitigation action will not result from elimination of all scientific uncertainty
  • Poisonous politics of the climate debate serves to limit a broader discussion of options
  • Ultimately technological innovation will precede political action, not vice versa
Regarding the IPCC, he says he has no debate with working group I on the science, some disagreements with working group II on impacts, adaptation, and vulnerability, and lots of debate with working group III on economics and mitigation, which this talk covers.

His slide for the outline of his talk looked like this:
  • Understanding the mitigation challenge
  • Where do emissions come from?
  • Decarbonization
  • The UK as a cautionary tale for U.S. policymakers
  • The U.S. situation and Waxman-Markey/Boxer-Kerry
  • How things might be different
Understanding the mitigation challenge

Although climate change involves other greenhouse gases besides CO2, he focused on CO2 and in this part of the talk gave a summary of CO2 accumulation in the atmosphere as a stock and flow problem, using a bathtub analogy. The inflow of CO2 into the atmosphere is like water pouring out of the faucet, there's outflow going out the drain, and the water in the tub is the accumulated CO2 in the atmosphere. The inflow is about 9 GtC (gigatons of carbon) per year and growing, and expected to hit 12 GtC per year by 2030. The current stock is a concentration of about 390 parts per million (ppm), increasing by 2-3 ppm/year. And the outflow is a natural removal of about 4 GtC/year. To stop the stock increase, the amount going in has to equal the amount going out. If we reach an 80% reduction in emissions by 2050, that is expected to limit the stock to 450 ppm.
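The bathtub analogy is a standard stock-and-flow model, and the numbers in the talk can be checked with a toy simulation. The conversion factor of roughly 2.13 GtC per ppm of atmospheric CO2 is my own assumption, not a figure from the talk:

```python
GTC_PER_PPM = 2.13  # assumed: gigatons of carbon per ppm of CO2

def simulate(years, stock_ppm=390.0, inflow_gtc=9.0, outflow_gtc=4.0):
    """Advance the CO2 'bathtub' year by year: inflow from the
    faucet (emissions), outflow down the drain (natural removal)."""
    for _ in range(years):
        stock_ppm += (inflow_gtc - outflow_gtc) / GTC_PER_PPM
    return stock_ppm

# A net flow of 5 GtC/year works out to about 2.3 ppm/year,
# matching the observed 2-3 ppm/year increase in the stock.
print(round(simulate(1) - 390.0, 2))
```

The model also makes the talk's central point concrete: the stock stops rising only when the faucet is throttled down to match the drain.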

Emissions have been growing faster than expected by the IPCC in 2000, with a 3.3% average increase per year between 2000 and 2007. While the economic slump has reduced emissions in 2009, it's expected that recovery and continued growth in emissions will occur.

Where do emissions come from?

Pielke used the following four lines to identify policy-relevant variables:
People
engage in economic activity that
uses energy
from carbon-emitting generation
The associated variables:
Population (P)
GDP per capita (GDP/P)
Energy intensity of the economy (Total Energy (TE)/GDP)
Carbon intensity of energy (C/TE)
The total carbon emissions = P * GDP/P * TE/GDP * C/TE.

This formula is known as the "Kaya Identity."
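As a sanity check, here is the identity evaluated with hypothetical round numbers (all four inputs are my own illustrative values, not figures from the talk; they are chosen to land near the ~9 GtC/year global total):

```python
def kaya_emissions(population, gdp_per_capita, energy_per_gdp, carbon_per_energy):
    """Total carbon emissions via the Kaya Identity:
    C = P * (GDP/P) * (TE/GDP) * (C/TE)."""
    return population * gdp_per_capita * energy_per_gdp * carbon_per_energy

# Hypothetical values: 7 billion people, $10,000 GDP per person,
# 7.5 MJ of energy per dollar of GDP, 17 grams of carbon per MJ.
grams = kaya_emissions(7e9, 1e4, 7.5, 17.0)
print(grams / 1e15)  # ~8.9 gigatons of carbon per year
```

The units cancel telescopically (people drop out of GDP/P, GDP out of TE/GDP, and so on), which is why the four policy levers Pielke lists are exhaustive.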

The policy tools available to reduce emissions by affecting these variables are: (1) population management to end up with fewer people, (2) limit the generation of wealth to have a smaller economy, (3) do the same or more with less energy by increasing efficiency, and (4) switch energy sources to generate energy with less emissions.

And that's it. Cap-and-trade, carbon taxes, etc. are designed to influence these variables.

Pielke then combined the first two variables (P * GDP/P) to get GDP, and the second two (TE/GDP * C/TE) he identified as Technology.

He argued that reducing GDP or GDP growth is not a policy option, so Technology is the only real policy option. Regarding the former point, he put up a graph very much like the Gapminder.org graph of world income, and observed that the Millennium Development Goals are all about pushing the people below $10/day--80% of the world's population--on that graph to the right. Even if all of the OECD nations were removed from the graph, there would still be a push to increase the GDP for the remainder and there would still be growing emissions.

He quoted Gwyn Prins regarding the G8 Summit to point out how policy makers are conflicted--they had a morning session on how to reduce gas prices for economic benefit, and an afternoon session on how to increase gas prices for climate change mitigation.

With this kind of a conflict, Pielke said, policy makers will choose GDP growth over climate change.

So that leaves Technology as an option, and he turned to the topic of decarbonization.

Decarbonization

Pielke put up a graph of global CO2 emissions per $1,000 of GDP over time, which showed a steady improvement in efficiency. In 2006, emissions were 29.12 Gt of CO2, which, divided by $47.267 trillion of GDP, gives 0.62 tons of CO2 per $1,000 of GDP. In 1980, the figure was above 0.90 tons of CO2 per $1,000 GDP.
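The intensity number is just emissions divided by GDP with a unit conversion, reading the 29.12 figure as gigatons of CO2 (the units that make the arithmetic work out):

```python
def carbon_intensity(emissions_gt_co2, gdp_trillions):
    """Tons of CO2 per $1,000 of GDP."""
    tons = emissions_gt_co2 * 1e9               # gigatons -> tons
    gdp_thousands = gdp_trillions * 1e12 / 1e3  # trillions of $ -> thousands of $
    return tons / gdp_thousands

print(round(carbon_intensity(29.12, 47.267), 2))  # 2006 global figure: 0.62
```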

Overall emissions track GDP, however, and since about 2000 the global economy has been getting more, not less, carbon intensive.

He looked at carbon dioxide per GDP (using purchasing power parity (PPP) for comparison between countries) for four different countries, Japan, Germany, the U.S., and China (ordered from most to least efficient). Japan hasn't changed much over time, but is very carbon efficient (below 0.50 tons of CO2 per $1,000 GDP). Germany and the U.S. are about the same, slightly above 0.50 tons of CO2 per $1,000 GDP, and both have improved similarly over time. China has gotten worse from 2002-2006 and is at about 0.75 tons of CO2 per $1,000 GDP.

He put up a slide of the EU-15 countries' decarbonization rates pre- and post-Kyoto Protocol, and though there was a gap between them, the slopes appeared comparable. For the first ten years of Kyoto, then, he said, there's no evidence of any improvement in the background rate of decarbonization. The pre-Kyoto trend went from above 0.55 tons of CO2 per $1,000 GDP to about 0.50; the post-Kyoto trend went from about 0.50 to below 0.45.

At this point, Clark Miller (head of my program in Human and Social Dimensions of Science and Technology) pointed out that given Japan, there is no reason to assume that there should have been a continuing downward trend at all, but Pielke reiterated that since the slopes appeared to be the same there's no evidence that Kyoto made a difference.

The UK as a cautionary tale for U.S. policymakers

Pielke identified the emissions targets of the UK Climate Change Act of 2008:

Average annual reductions of 2.8% from 2007 to 2020, to reach 42% below 1990 levels by 2020.

Average annual reductions of 3.5% from 2020, to reach 80% below 1990 levels by 2050.

The former target of 42% below 1990 levels is contingent upon COP15 reaching an agreement this December; otherwise the unilateral target is 34% below 1990 levels.
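Annual percentage cuts compound, so the targets can be roughly sanity-checked (this sketch compares only against the 2007 starting point, ignoring where 2007 emissions sit relative to the 1990 baseline):

```python
def remaining_fraction(annual_cut, years):
    """Fraction of starting-year emissions left after compounding
    a constant annual percentage reduction."""
    return (1 - annual_cut) ** years

# 2.8%/year over the 13 years from 2007 to 2020 leaves about 69%
# of 2007 emissions, i.e. roughly a 31% cut from the 2007 level.
print(round(1 - remaining_fraction(0.028, 13), 2))
```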

Pielke showed a graph of the historical rate of decarbonization for the UK economy, and compared it to graphs of manufacturing output and manufacturing employment, observing that the success of decarbonization of the UK economy from 1980-2006 has been due primarily to offshoring of manufacturing, something that's not sustainable--once they reach zero, there's nowhere further down to go.

He then used France as a point of comparison, since it has the lowest CO2/GDP of any developed country, due to its use of nuclear power for most of its electricity--it's at 0.30 tons of CO2 per $1,000 GDP, and a lot of that is emissions from gasoline consumption for transportation.

It took France about 22 years, from 1984-2006, to get its emissions to that rate.

For the UK to hit its 2020 target, it needs to improve to France's rate in the next five years, by 2015. That means building 30 new nuclear power plants and reducing the equivalent coal and gas generation; Pielke said he would "go out on a limb" and say that this won't happen.

That will only get them 1/3 of the way to their 2020 goals.

The UK plan calls for putting 1.7 million electric cars on the road by 2020, which means doubling the current rate of auto sales and selling only electric cars.

For the entire world to reach France's level of efficiency by 2015 would require a couple of thousand nuclear power plants.

The U.S. situation and Waxman-Markey/Boxer-Kerry

The U.S., said Pielke, has had one of the highest rates of sustained decarbonization, from 1980-2006, going from over 1.00 tons of CO2 per $1,000 GDP to the current level of about 0.50 tons of CO2 per $1,000 GDP.

The Waxman-Markey target is an 80% reduction by 2050, not quite as radical as the UK.
The Boxer-Kerry target is a 17% reduction by 2020.

Pielke broke down the current U.S. energy supply by source in quadrillions of BTUs (quads), and pointed out that he got all of his data from the EIA and encouraged people to look it up for themselves:
Petroleum: 37.1
Natural gas: 23.8
Coal: 22.5
Renewable: 7.3
Nuclear: 8.5
Total energy was about 99.2 quads in 2007, of which 83.4 came from coal, natural gas, and petroleum.

Emissions by source:
Coal: 95 MMt CO2/quad
Natural gas: 55 MMt CO2/quad
Petroleum: 68 MMt CO2/quad
Multiply those by the amount of energy produced by each source and add them up:
95 * 22.5 + 55 * 23.8 + 68 * 37.1 = 5,969 MMt CO2
The actual total emissions were at about 5,979, so the above back-of-the-envelope calculation was pretty close.
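That back-of-the-envelope estimate is easy to reproduce from the quads and per-quad emissions factors above:

```python
# 2007 U.S. fossil energy (quads) and emissions factors (MMt CO2/quad),
# as reported in the talk.
quads = {"coal": 22.5, "natural gas": 23.8, "petroleum": 37.1}
factors = {"coal": 95, "natural gas": 55, "petroleum": 68}

estimate = sum(quads[fuel] * factors[fuel] for fuel in quads)
print(round(estimate))  # ~5,969 MMt CO2, vs. the actual total of ~5,979
```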

In 2009, U.S. energy consumption will be about 108.6 quads, of which 21 quads will come from renewables and nuclear (40% growth from 2007), which leaves 87.2 quads from fossil fuels, a 4.6% increase from 2007.

If we substituted natural gas for all coal, then our 2020 emissions would be 5,300 MMt CO2, higher than the 2020 target and 12% below 2005, and would still lock us into a carbon intensive future.

In order to meet targets, we need to reduce coal consumption by 40%, or 11 quads, and replace that with renewables plus nuclear, plus an additional 3.8 quads of growth by 2020.

One quad equals about 15 nuclear plants, so 14.8 quads means building 222 new nuclear plants (on top of the 104 that are currently in the U.S.).

Or, alternatively, assuming 100 concentrated solar power installations (30 MW peak each) per quad, 1,480 such installations for 14.8 quads, or one brought online every two days until 2020.

Or, assuming 37,500 wind turbines (80 kW peak each) per quad, 555,000 such turbines for 14.8 quads, or one 150-turbine wind farm brought online daily until 2020.

To reach these targets with wind and solar would require increasing them by a factor of 37 by 2020; Obama has promised only a tripling.
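The installation counts follow directly from the per-quad conversion factors given in the talk:

```python
# Replacement capacity per quad, as given in Pielke's slides.
NUCLEAR_PLANTS_PER_QUAD = 15      # large nuclear plants
SOLAR_INSTALLS_PER_QUAD = 100     # 30 MW-peak concentrated solar plants
WIND_TURBINES_PER_QUAD = 37_500   # 80 kW-peak turbines

quads_needed = 14.8  # 11 quads of displaced coal + 3.8 quads of growth

print(round(quads_needed * NUCLEAR_PLANTS_PER_QUAD))  # new nuclear plants
print(round(quads_needed * SOLAR_INSTALLS_PER_QUAD))  # solar installations
print(round(quads_needed * WIND_TURBINES_PER_QUAD))   # wind turbines
```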

Could we meet the targets by increasing efficiency of our energy consumption? We would have to reduce total energy consumption to 85.5 quads by 2020 (rather than 108.6), about equal to U.S. energy consumption in 1992, when the U.S. economy was 35% smaller than in 2007. That would be improving efficiency by about a third.

How fast can decarbonization occur? We don't know, because no one has really set out to intentionally do that. Historical rates have been 1-2% per year by developed countries; for short periods, some countries have exceeded 2% per year. Japan, from 1981-1986, improved by over 4% per year.

Pielke argued that these targets are not feasible targets in the U.S. or UK, and so policy makers are adding safety valves, offsets, and other mechanisms to allow some manipulation to give the appearance of success. Achieving 80% reduction in global emissions by 2050 requires > 5% decarbonization per year.
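The >5%/year figure also follows from compounding: an 80% absolute cut over roughly four decades requires about 4%/year even with zero economic growth, and any GDP growth pushes the required rate of decline in emissions intensity (C/GDP) higher. A sketch, with the 2%/year growth rate being my own assumption:

```python
def required_decarbonization(cut, years, gdp_growth):
    """Annual decline in emissions intensity (C/GDP) needed for an
    absolute emissions cut of `cut` over `years`, with GDP growing
    at `gdp_growth` per year."""
    emissions_multiplier = (1 - cut) ** (1 / years)  # per-year emissions change
    return 1 - emissions_multiplier / (1 + gdp_growth)

# 80% cut over 40 years with zero GDP growth: ~3.9%/year.
print(round(required_decarbonization(0.80, 40, 0.00), 3))
# With an assumed 2%/year GDP growth: ~5.8%/year.
print(round(required_decarbonization(0.80, 40, 0.02), 3))
```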

The problem, Pielke argued, is that the policy logic of targets and timetables is backwards, and we should focus on improving efficiency and decarbonization rather than emissions targets.

How things might be different

Pielke's suggested alternative strategy was presented in a slide something like this:
  • Focus policy on decarbonization of the economy (not simply emissions)
  • Efficiency gains (follow the Japanese model, “frontrunner program” by industry, look at best performer and set it as regulatory standard)
  • Expand carbon free energy (low carbon tax, other policies--subsidies, regulation, etc.)
  • Innovation-focused investments
  • To create ever advancing frontier of potential efficiency gains
  • Air capture backstop
  • Adaptation
The Japanese "frontrunner" program was where the government went industry by industry, identified the most efficient company in each industry, and set regulations to make that company the baseline standard for the other companies to meet.

Pielke argued that there should be a carbon tax of, say, $5/ton (or whatever is the "highest price politically possible"), with the collected funds (that would raise about $700B/year) used to promote innovation in energy efficiency.

If we find that we're stabilizing at 635 ppm, we may want to "brute force" some removal of carbon from the atmosphere (e.g., geoengineering).

In the Q&A session, Clark Miller questioned Pielke's claim that we can't replace our energy infrastructure quickly--if it costs $2.61B for a 1,400 MW nuclear plant, we'd need 65 of them (fewer than Pielke's number, since Pielke assumed smaller plants) at a cost of $260B. Since there is capital floating around causing asset bubbles in the trillions, and the energy industry is expected to become a $15T industry, surely there would be some drive to build them if they're going to become profitable. (Not to mention peak oil as a driver.) He agreed that it would take longer to construct these, but asked what the upshot would be if this were done by, say, 2075.

Also in the Q&A, Pielke pointed out that in a previous presentation of this talk, a philosophy professor had suggested that the population variable could be affected by handing out cyanide pills. (Or by promoting the growth of the Church of Euthanasia.) What I didn't mention above was that Pielke also briefly discussed improvements to human lifespan, and in his other talk (summary to come), he talked about how the IPCC's projections assume that we will not try to eradicate malaria...

ADDENDUM (November 7, 2009): I've seen estimates that U.S. carbon emissions will be about 6% lower in 2009 as a result of the recession, which amounts to considerable progress towards the Boxer-Kerry target. Projections of an economic recovery in 2010 strike me as overly optimistic; in my opinion there's a strong possibility that we haven't hit bottom yet and there's worse to come. Still, though, I think Pielke's probably right that energy consumption will go right back up again unless the recession becomes a depression and results in significant changes in consumption habits.

My summary of Pielke's lecture on climate change adaptation is here.

ADDENDUM (November 9, 2009): It should be noted that Roger Pielke, Jr. is a somewhat controversial figure in the climate change debate, believed by many in the climate change blogosphere to be in the climate change skeptic camp, or to be biased toward it in terms of where he levels his criticisms. A post titled "Who Framed Roger Pielke?" from the Only In It For the Gold blog links to a number of opinions expressing these views.

UPDATE (February 5, 2010): A post titled "The Honest Joker" at Rabett Run argues that Pielke Jr.'s stance as an "honest broker" is a sham.

UPDATE (August 28, 2010): A talk by Pielke that appears to have some similarity to this one may be found here.

UPDATE (July 13, 2014): An updated version of the information in this talk is Ch. 3 of Pielke Jr.'s book, The Climate Fix (2010, Basic Books).

Charles Phoenix's retro slide show--in Phoenix


Tonight and tomorrow night at 8 p.m., Charles Phoenix will bring his Retro Slide Show Tour to the Phoenix Center for the Arts at 1202 N. 3rd St.

I've not seen his show before, but I've enjoyed his blog's slide-of-the-week feature and plan to go see this.

Here's the official description:

A laugh-out-loud funny celebration of '50s and '60s road trips, tourist traps, theme parks, world's fairs, car fashion fads, car culture and space age suburbia, will also include a selection of vintage images of the Valley of the Sun.

Click the above link for more details or to buy tickets.

Wednesday, November 04, 2009

Where is the academic literature on skepticism as a social movement?

Here's all I've been able to find so far, independent of self-descriptions from within the movement (and excluding history and philosophy of Pyrrhonism, Academic Skepticism, the Carvaka, the Enlightenment, British Empiricism, and lots of work on the development of the enterprise of science):
  • George Hansen, "CSICOP and the Skeptics: An Overview," The Journal of the American Society for Psychical Research vol. 86, no. 1, January 1992, pp. 19-63. I've not seen a more detailed history of contemporary skepticism elsewhere.
  • Stephanie A. Hall, "Folklore and the Rise of Moderation Among Organized Skeptics," New Directions in Folklore vol. 4, no. 1, March 2000.
  • David J. Hess, Science in the New Age: The Paranormal, Its Defenders and Debunkers, and American Culture, 1993, The University of Wisconsin Press.
I note that Paul Kurtz's The New Skepticism: Inquiry and Reliable Knowledge (1992, Prometheus Books) puts contemporary skepticism in the lineage of several of the other forms of philosophical skepticism I mentioned above, identifying his form of skepticism as a descendant of pragmatism in the C.S. Peirce/John Dewey/Sidney Hook tradition (and not the Richard Rorty style of pragmatism). But I think that says more about Kurtz than about the skeptical movement, which also draws upon other epistemological traditions and probably doesn't really have a sophisticated epistemological framework to call its own.

There's a lot of literature on parallel social movements of various sorts, including much about advocates of some of the subject matter that skeptics criticize, and some of that touches upon skeptics. For example:
  • Harry Collins and Trevor Pinch, "The Construction of the Paranormal: Nothing Unscientific is Happening," in Roy Wallis, editor, On the Margins of Science: The Social Construction of Rejected Knowledge, 1979, University of Keele Press, pp. 237-270.
  • Harry Collins and Trevor Pinch, Frames of Meaning: The Social Construction of Extraordinary Science, 1982, Taylor & Francis.
  • Ronald L. Numbers, The Creationists: From Scientific Creationism to Intelligent Design, 2nd edition, 2006, Harvard University Press.
  • Christopher P. Toumey, God's Own Scientists: Creationists in a Secular World, 1994, Rutgers University Press.
The Toumey book doesn't really have anything about skeptics, but is an anthropological study of creationists in the United States which describes the connection between "creationism as a national movement" and "creationism as a local experience" that seems intriguingly similar to the skeptical movement, especially in light of the fact (as I mentioned in my previous post) that national skeptical organizations are independent of established institutions of science that provide the key literature of the movement and at least implicitly assume that the average layman can develop the ability to discern truth from falsehood, at least within a particular domain, from that literature.

In some ways, the skeptical movement also resembles a sort of layman's version of the activist element in the field of science and technology studies, based on positivist views of science that are the "vulgar skepticism" dismissed in this article.
I think if contemporary skepticism wants to achieve academic respectability, it will need to develop a more sophisticated view of science that comes to terms with post-Popper philosophy of science and post-Merton sociology of science; my recommendation for skeptics who are interested in that subject is to read, as a start:
  • Philip Kitcher, The Advancement of Science: Science Without Legend, Objectivity Without Illusions, 1995, Oxford University Press.
There's an enormous relevant literature on those topics; an interesting broad overview is:
  • R.C. Olby, G.N. Cantor, J.R.R. Christie, and M.J.S. Hodge, Companion to the History of Modern Science, 1990, Routledge.
I welcome pointers to any relevant sources I've missed, particularly other academic work specifically addressing the history, philosophy, sociology, and anthropology of the contemporary skeptical movement--three sources ain't much.

UPDATE (September 27, 2014): Some additional works I recommend for skeptics:

  • Harry Collins, Are We All Scientific Experts Now?, 2014, Polity Press.  A very brief and quick overview of science studies with respect to expertise.
  • Massimo Pigliucci, Nonsense On Stilts: How to Tell Science from Bunk, 2010, University of Chicago Press. A good corrective to the overuse of Popper, easy read.
  • Massimo Pigliucci and Maarten Boudry, Philosophy of Pseudoscience: Reconsidering the Demarcation Problem, 2013, University of Chicago Press. Good collection of essays reopening the debate many thought closed by Larry Laudan on whether there can be philosophical criteria for distinguishing the boundary between science and pseudoscience.

What are the goals of Skepticism 2.0?

Yesterday I listened to D.J. Grothe's interview with Ben Radford on the Point of Inquiry podcast, discussing the latest issue of the Skeptical Inquirer (November/December 2009) on "Skepticism 2.0": the bottom-up grassroots expansion of the skeptical movement through Internet communications tools like blogs, podcasts, online videos, and forums, and the real-world activities that have become possible through them, like meetups and SkeptiCamps.

Near the end of the podcast, D.J. asked Ben what he thought the results of Skepticism 2.0 would be in five years' time. He said (1) more skeptics and (2) more cooperative projects among the three major U.S. skeptical groups: the Committee for Skeptical Inquiry, the James Randi Educational Foundation, and the Skeptics Society.

That struck me as a disappointingly modest set of goals, as well as insular, "old school" skeptical thinking. Surely we can come up with something more exciting, interesting, and useful for the next five years than the self-perpetuation and growth of the skeptical movement and cooperation among the traditional top-down skeptical organizations.

A few thoughts that came to my mind:
  • If skeptics want to promote public understanding of science and critical thinking, why not form partnerships with other organizations that share those purposes--the National Academies, the National Center for Science Education, and teachers' groups and school groups at the local level?
  • If skeptics want to promote the activity of science, why not look at ways to help motivate students to enter science as a career, and support them in doing so? I've previously suggested to Phil Plait that JREF might partly model itself after the Institute for Humane Studies, an organization which provides support for undergraduate and graduate students who favor classical liberal political ideals, in order to help them achieve success in careers of thought leadership, including academics, journalists, filmmakers, public policy wonks, and so on. In order for skepticism and critical thinking to have a significant impact, it's not necessary that everyone become a skeptic, only that a sufficient number of people in the right places engage in and encourage critical thinking.
  • If skeptics want to see more diversity in the skeptical movement, why not look at ways to reach out to other communities? The podcast did mention the SkepTrack at Dragon*Con, which is one of the most innovative ideas for outreach for skeptical ideas since the founding of CSICOP in 1976.
  • If skeptics want to act as a form of consumer protection against fraud and deception, why not try to find ways to interact with regulators, investigators, politicians, and the media to get fraudulent products and services off the market? The UK complaints against chiropractors making false claims on their websites as a response to the British Chiropractic Association libel lawsuit against Simon Singh, or the Australian complaint against bogus claims by anti-vaccinationists (though see my comment on that blog post for some reservations) might suggest some ideas.
It seems to me that the skeptical movement should be concerned with more than just increasing its own numbers and getting the existing national groups to work together. I think that Skepticism 2.0 has forced, and will continue to force, the existing groups to cooperate with each other and with the grassroots movement if they don't want to become obsolete and irrelevant. And at this point growth is, at least for the near term, a foregone conclusion. But in order to continue to grow and thrive, the movement needs goals tied to being useful and making the world a better place, by which it can measure its effectiveness and success.

I'm sure readers of this blog have further suggestions. What else?

Addendum:

By the way, with regard to my first suggestion, here's a question that may provide some motivation and food for thought: Why do the Parapsychological Association and the National Center for Complementary and Alternative Medicine have better and more formal ties to official institutions of science than any skeptical organization? The PA is a member of the AAAS, and NCCAM is an agency within the National Institutes of Health. The main difference between those organizations and skeptical organizations is that they actually do and publish peer-reviewed scientific research.

Tuesday, November 03, 2009

More Scientology exposure from the St. Pete Times

The St. Petersburg Times has published another three-part exposé on the Church of Scientology based on interviews with former high-level members. (The first three-part series from June is discussed here; I missed the second three-part series from August about new defectors; all three series may be found on the SP Times website here.)

Part 1 (October 31): "Chased by their church: When you leave Scientology, they try to bring you back"

An overview of this new, third series of exposures based on information from former high-ranking members of the Church of Scientology such as Mark "Marty" Rathbun and Mike Rinder.

The story of how the church commands and controls its staff is told by the pursuers and the pursued, by those who sent spies and those spied upon, by those who interrogated and those who rode the hot seat. In addition to Rathbun, they include:

• Mike Rinder, who for 25 years oversaw the church's Office of Special Affairs, which handled intelligence, legal and public affairs matters. Rinder and Rathbun said they had private investigators spy on perceived or potential enemies.

They say they had an operative infiltrate a group of five former Scientology staffers that included the Gillham sisters, Terri and Janis, two of the original four "messengers" who delivered Hubbard's communications. They and other disaffected Scientologists said they were spied on for almost a decade.

• Gary Morehead, the security chief for seven years at the church's international base in the desert east of Los Angeles. He said he helped develop the procedure the church followed to chase and return those who ran, and he brought back at least 75 of them. "I lost count there for awhile."

Staffers signed a waiver when they came to work at the base that allowed their mail to be opened, Morehead said. His department opened all of it, including credit card statements and other information that was used to help track runaways.

• Don Jason, for seven years the second-ranking officer at Scientology's spiritual mecca in Clearwater, supervised a staff of 350. He said that after he ran, he turned himself in and ended up locked in his cabin on the church cruise ship, the Freewinds. He said he was held against his will.

Part 2 (November 2): "Scientology: What happened in Vegas"

How ex-members Terri and Janis Gillham, who had been Sea Org "messengers" for L. Ron Hubbard and whose legal guardian had been Hubbard's wife Mary Sue, had their mortgage business in Las Vegas infiltrated by spies working for the Church of Scientology to keep tabs on what they were up to. Mark Fisher, Scientology head David Miscavige's aide de camp for seven years, was spied on by the man he thought was his best friend.

Part 3 (November 3): "Man overboard: To leave Scientology, Don Jason had to jump off a ship"

After leaving the Church once and returning, Don Jason was put aboard the Freewinds, a Scientology ship, and monitored constantly. He managed to get off the ship in the Bahamas by effectively zip-lining down a cable with a home-made device, then boarded a plane to Milwaukee by way of Tampa and Atlanta. Someone from the Church booked the seat next to his, and Rathbun (still in the Church at the time) met him in Tampa and bought a ticket on his flight to try to talk him into returning.

Sunday, November 01, 2009

More apparent plagiarism from Ian Plimer

Eli Rabett and Pieter Tans identified errors in Ian Plimer's book's claim of selective data reporting in the Mauna Loa measurements of atmospheric carbon, which Tim Lambert at the Deltoid ScienceBlog traces back to climate change skeptic Ferdinand Engelbeen. But Plimer doesn't cite Engelbeen, perhaps because Engelbeen also refutes the argument Plimer is trying to make.

This is not the first time Plimer has copied without quoting or citing sources--multiple instances in his book Telling Lies for God have previously been identified by Jeffrey Shallit and me.

(Previously on Plimer at this blog.)

Friday, October 30, 2009

Maricopa County Notices of Trustee's Sales for October 2009

I haven't posted one of these things in a while, so I figured it was about time.

The big peak was in March, with a total of 10,725 notices that month. October's total was 6,618.