Sunday, May 02, 2010

Politics and science in risk assessment

There’s a widespread recognition that public policy should be informed both by scientifically verifiable factual information and by social values.  It’s commonly assumed that science should provide the facts for policy-makers, and that the policy-makers should then use those facts, along with the social and political values of the citizens they represent, to make policy.  This division between fact and value is institutionalized in processes such as the separation of risk assessment, performed by scientists concerned solely with the facts, from subsequent risk management, which also involves values and is performed in the sphere of politics.  This neat division, however, doesn’t work that well in practice.

“Taking European Knowledge Society Seriously,” a 2007 “Report by the Expert Group on Science and Governance to the Science, Economy and Society Directorate, Directorate-General for Research” of the European Commission, spends much of its third chapter criticizing this division and the idea that risk assessment can be performed in a value-free way.  Some of the Report’s objections are similar to those made by Heather Douglas in her book Science, Policy, and the Value-Free Ideal, and her analysis of a topography of values is complementary to the Report.  The selection of what counts as input into the risk assessment process, for example, is a value-laden decision that is analogous to Douglas’ discussion of problem selection.  Health and safety concerns are commonly paramount, but other potential risks--to the environment, to the economy, to social institutions--may be minimized, dismissed, or ignored.  The selection of methods of measurement can also implicitly involve values, as Douglas likewise observes.  The Report notes, “health can be measured alternatively as frequency or mode of death or injury, disease morbidity, or quality of life,” and questions arise as to how to aggregate and weight different populations, compare humans to nonhumans, and weigh future generations against present generations.

In practice, scientists tend to recognize questions of these sorts, and to recognize that they are value-laden.  This can lead to the process bogging down, with scientists wanting policy-makers to answer value questions before they perform their risk assessment, while policy-makers insist that they just want the scientific facts of the matter before making any value-based decisions.  Because science is a powerful justification for policy, it’s in the interest of the policy-maker to push as much as possible to the science side of the equation.  We see this occur in Congress, which tends to pass broad-brush statutes that “do something” about a problem but push all the details to regulatory agencies, so that Congress can take credit for action but blame the regulatory agencies if it doesn’t work as expected.  We see it in judicial decisions, where the courts tend to be extremely deferential to science.  And we see it within regulatory agencies themselves, as when EPA Administrator Carol Browner went from saying “The question is not one of science, the question is one of judgment” (Dec. 1996, upon initially proposing ozone standards) to “I think it is not a question of judgment, I think it is a question of science” (March 1997, about those same standards).  The former position is subject to challenge in ways that the latter is not.

In reality, any thorough system of risk management needs to be iterative, involving both scientific judgments about facts and political decisions that take values into account--taking care not to use values to reach predetermined conclusions, but to recognize which sets of interests and concerns are of significance.  This doesn’t preclude the standardization of methods of quantification and assessment; it just means that those methods need to be able to evolve in response to feedback, and to begin from a state where values are explicitly used in identifying what facts need to be assessed.

[A slightly different version of the above was written as a comment for my Human and Social Dimensions of Science and Technology core seminar. Thanks to Tim K. for his comments.]

Thursday, April 29, 2010

Science fiction scenarios and public engagement with science

Science fiction has been a popular genre at least since Jules Verne’s 19th century work, and arguably longer still. But can it have practical value as well as be a form of escapist entertainment? Clark Miller and Ira Bennett of ASU suggest that it has potential for use in improving the capacity of the general public “to imagine and reason critically about technological futures” and for being integrated into technology assessment processes (“Thinking longer term about technology: is there value in science fiction-inspired approaches to constructing futures?” Science and Public Policy 35(8), October 2008, pp. 597-606).

Miller and Bennett argue that science fiction can stimulate people to wake from “technological somnambulism” (Langdon Winner’s term for taking for granted, or being oblivious to, sociotechnical changes): to recognize such changes, to realize that there may be alternative possibilities and that particular changes need not be predetermined, and to engage with the deliberative processes and institutions that choose directions of change. Where most political planning is short-term and based on projections that simply extend current trends incrementally into the future, science fiction provides scenarios which exhibit “non-linearity” by involving multiple, major, and complex changes from current reality. While these scenarios “likely provide...little technical accuracy” about how technology and society will actually interact, they may still provide ideas about alternative possibilities, and in particular “clear visions of desirable--and not so desirable--futures.”

The article begins with a quote from Christine Peterson of the Foresight Institute recommending that “hard science fiction” be used to aid in “long-term” (20+ year) prediction scenarios; she advises, “Don’t think of it as literature,” and to focus on the technologies rather than the people. Miller and Bennett, however, argue otherwise--that not only is science fiction useful for thinking about longer-term consequences, but the parts about the people--how technologies actually fit into society--are just as important as, if not more important than, the ideas about the technologies themselves.

The article ends with examples of the use of science fiction in workshops for nanotechnology researchers conducted by Bennett, and with suggested uses in science education and in “society’s practices and institutions for public engagement and technology assessment.” About the former suggested use, the authors write that “The National Science Foundation, which has by and large not been in the business of supporting science fiction, might be encouraged to fund training and/or networking exercises that would foster greater interaction among scientists and fiction writers.”

While some steps have been taken to promote interaction between scientists and fiction writers--most notably the National Academy of Sciences’ Science and Entertainment Exchange project headed by executive director Jennifer Ouellette, who spoke at last year’s The Amazing Meeting 7--this interaction is mostly one-way. The project is conceived of as a way for science to be accurately communicated to the general public through entertainment, rather than facilitating the generation of ideas for technological innovation and scientific development from the general public or the entertainment stories that are created. The SEE promotes the idea of collaboration between scientists and entertainment producers on the creative works of entertainment, but not necessarily directing creative feedback into science or building new capacities in science and technology, except indirectly by providing the general public with inspiration about science. Similarly, the Skeptrack and Science Track at the annual Dragon*Con science fiction convention in Atlanta provide ways for scientists and skeptics to interact with science fiction fans (and creators of science fiction works), but the communication is primarily in one direction via speakers and panels, with an opportunity for Q&A. (This is unlike the notion of a SkeptiCamp, where all participants are potentially on an equal basis, with everyone given the opportunity to be a presenter.)

[P.S. The Long Now Foundation is an organization that makes the Foresight Institute’s time horizon look short--their time frame is the next 10,000 years, with a focus on how to make extremely long-term projects work and how to create an institutional framework that can persist for extremely long periods of time. (The obligatory science fiction references are Walter M. Miller, Jr.’s A Canticle for Leibowitz and Neal Stephenson’s Anathem.)]

[A slightly different version of the above was written for my Human and Social Dimensions of Science and Technology core seminar. Thanks to Judd A. for his  comments--he raised the concern that SkeptiCamp is connected to a rationalist form of skepticism that is concerned to "narrow the range of 'acceptable' beliefs" rather than widen it.  While this may be true, depending on what the class of "acceptable" beliefs is prior to applying a skeptical filter, it need not be--applying scientific methodology and critical thinking can also open up possibilities for individuals.  And if the initial set of beliefs includes all possibilities, converting that set to knowledge must necessarily involve narrowing rather than expanding the range, as there are many more ways to go wrong than to go right.  But this criticism points out something that I've observed in my comparison of skepticism to Forteanism--skepticism is more concerned about avoiding Type I errors than Type II errors, while Forteans are more concerned about avoiding Type II errors than Type I errors, and these are complementary positions that both need representation in society.]
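The complementary error-avoidance strategies described above can be made concrete with a toy sketch (my own illustration, with made-up evidence scores, not anything from the post or its sources): raising the threshold for accepting a claim reduces Type I errors (accepting falsehoods) at the cost of more Type II errors (rejecting truths), and vice versa.

```python
# Toy illustration (invented data): each claim has an evidence score and a
# ground truth; an acceptance threshold trades off Type I errors (accepting
# falsehoods) against Type II errors (rejecting truths).

claims = [
    (0.9, True), (0.7, True), (0.4, True),     # true claims
    (0.6, False), (0.3, False), (0.1, False),  # false claims
]

def error_counts(threshold):
    """Return (Type I, Type II) error counts at a given acceptance threshold."""
    type1 = sum(1 for score, true in claims if score >= threshold and not true)
    type2 = sum(1 for score, true in claims if score < threshold and true)
    return type1, type2

print(error_counts(0.8))  # strict "skeptical" filter: prints (0, 2)
print(error_counts(0.2))  # permissive "Fortean" filter: prints (2, 0)
```

Neither threshold is correct in the abstract; which errors matter more depends on the costs of each, which is why both dispositions can usefully coexist.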

Thursday, April 22, 2010

Haven't we already been nonmodern?

Being modern, argues Bruno Latour in We Have Never Been Modern (1993, Harvard Univ. Press), involves drawing a sharp distinction between “nature” and “culture,” through a process of “purification” that separates everything into one or the other of these categories. It also involves breaking with the past: “Modernization consists in continually exiting from an obscure age that mingled the needs of society with scientific truth, in order to enter into a new age that will finally distinguish clearly what belongs to atemporal nature and what comes from humans, what depends on things and what belongs to signs” (p. 71).

But hold on a moment--who actually advocates that kind of a sharp division between nature and culture, without acknowledging that human beings and their cultures are themselves a part of the natural order of things? As the 1987 Love and Rockets song, “No New Tale to Tell,” said: “You cannot go against nature / because when you do / go against nature / it’s part of nature, too.” Trying to divide the contents of the universe into a sharp dichotomy often yields a fuzzy edge, if not outright paradox. While Latour is right to object to such a sharp distinction (or separation) and to argue for a recognition that much of the world consists of “hybrids” that include natural and cultural aspects (true of both material objects and ideas), I’m not convinced that he’s correctly diagnosed a genuine malady when he writes that “Moderns ... refuse to conceptualize quasi-objects as such. In their eyes, hybrids present the horror that must be avoided at all costs by a ceaseless, even maniacal purification” (p. 112).

Latour writes that anthropologists do not study modern cultures in the manner that they study premodern cultures. For premoderns, an ethnographer will generate “a single narrative that weaves together the way people regard the heavens and their ancestors, the way they build houses and the way they grow yams or manioc or rice, the way they construct their government and their cosmology,” but this is not done for modern societies because “our fabric is no longer seamless” (p. 7). True, but the real problem for such ethnography is not that we don’t have such a unified picture of the world (and we don’t) but that we have massive complexity and specialization--a complexity which Latour implicitly recognizes (pp. 100-101) but doesn’t draw out as a reason.

The argument that Latour makes in the book builds upon this initial division of nature and culture by the process of “purification” with a second division between “works of purification” and “works of translation,” “translation” being a four-step process of his advocated framework of actor-network theory that he actually doesn’t discuss much in this book. He proposes that the “modern constitution” contains “works of translation”--networks of hybrid quasi-objects--as a hidden and unrecognized layer that needs to be made explicit in order to be “nonmodern” (p. 138) or “amodern” (p. 90) and avoid the paradoxes of modernity (or other problems of anti-modernity, pre-modernity, and post-modernity).

His attempt to draw the big picture is interesting and often frustrating, as when he makes unargued-for claims that appear to be false, e.g., “as concepts, ‘local’ and ‘global’ work well for surfaces and geometry, but very badly for networks and topology” (p. 119); “the West may believe that universal gravitation is universal even in the absence of any instrument, any calculation, any decoding, any laboratory ... but these are respectable beliefs that comparative anthropology is no longer obliged to share” (p. 120; also p. 24); speaking of “time” being reversible where he apparently means “change” or perhaps “progress” (p. 73); his putting “universality” and “rationality” on a list of values of moderns to be rejected (p. 135). I’m not sure how it makes sense to deny the possibility of universal generalizations while putting forth a proposed framework for the understanding of everything.

My favorite parts of the book were his recounting of Steven Shapin and Simon Schaffer’s Leviathan and the Air Pump (pp. 15-29) and his critique of that project, and his summary of objections to postmodernism (p. 90). Latour is correct, I think, in his critique that those who try to explain the results of science solely in terms of social factors are making a mistake that privileges “social” over “natural” in the same way that attempting to explain them without any regard to social factors privileges “natural” over “social.” He writes to the postmodernists (p. 90):

“Are you not fed up at finding yourselves forever locked into language alone, or imprisoned in social representations alone, as so many social scientists would like you to be? We want to gain access to things themselves, not only their phenomena. The real is not remote; rather, it is accessible in all the objects mobilized throughout the world. Doesn’t external reality abound right here among us?”

In a commentary on this post, Gretchen G. observed that we do regularly engage in the process of "purification" about our concepts and attitudes towards propositions in order to make day-to-day decisions--and I think she's right.  We do regard things as scientific or not scientific, plausible or not plausible, true or false, even while we recognize that there may be fuzzy edges and indeterminate cases.  And we tend not to like the fuzzy cases, and to want to put them into one category or the other.  In some cases, this may be merely an epistemological problem of our human (and Humean) predicament where there is a fact of the matter; in others, our very categories may themselves be fuzzy and not fit reality ("carve nature at its joints").

[A slightly different version of the above was written for my Human and Social Dimensions of Science and Technology core seminar. Thanks to Gretchen G. for her comments.  An entertaining critique of Latour's earlier book Science in Action is Olga Amsterdamska's "Surely You're Joking, Monsieur Latour!", Science, Technology, and Human Values vol. 15, no. 4 (1990): 495-504.]

Wednesday, April 21, 2010

Matthew LaClair vs. Texas Board of Education

Matthew LaClair, who exposed his proselytizing U.S. history teacher/youth pastor in 2006, now hosts his own radio show, "Equal Time for Freethought," on WBAI 99.5 FM on Sundays at 6:30 p.m. ET in the New York/New Jersey/Connecticut area.  The show is also online via streaming audio.

This coming Sunday, April 25, Matthew will be debating a conservative member of the Texas Board of Education about their recent changes to the curriculum (e.g., removing Thomas Jefferson).

If you happen to miss the show, it will subsequently be available in the online archives.

Tuesday, April 20, 2010

Translating local knowledge into state-legible science

James Scott’s Seeing Like a State (about which I've blogged previously) talks about how the state imposes standards in order to make features legible, countable, regulatable, and taxable. J. Stephen Lansing’s Perfect Order: Recognizing Complexity in Bali describes a case where the reverse happened. When a top-down, scientifically designed system of water management was imposed on Balinese rice farmers in the name of modernization in the early 1970s, the result was a brief increase in productivity followed by disaster. Rather than leading to more efficient use of water and continued improved crop yields, it produced pest outbreaks which destroyed crops. An investment of $55 million in Romijn gates to control water flow in irrigation canals had the opposite of the intended effect. Farmers removed the gates or lifted them out of the water and left them to rust, upsetting the consultants and officials behind the project. Pesticides delivered to farmers produced pesticide-resistant brown leafhoppers, and supplied fertilizers washed into the rivers and killed coral reefs at their mouths.

Lansing was part of a team sponsored by the National Science Foundation in 1983 that evaluated the Balinese farmers’ traditional water management system to understand how it worked. The farmers of each village belong to subaks, or organizations that manage rice terraces and irrigation systems, which are referred to in Balinese writings going back at least a thousand years. Lansing notes that “Between them, the village and subak assemblies govern most aspects of a farmer’s social, economic, and spiritual life.”

Lansing’s team found that the Balinese system of water temples, religious ritual, and irrigation managed by the subaks would synchronize fallow periods of contiguous segments of terraces, so that long segments could be kept flooded after harvest, killing pests by depriving them of habitat. But their attempt and that of the farmers to persuade the government to allow the traditional system to continue fell upon deaf ears, and the modernization scheme continued to be pushed.

In 1987, Lansing worked with James Kremer to develop a computer model of the Balinese water temple system, and ran a simulation using historical rainfall data. This translation of the traditional system into scientific explanation showed that the traditional system was more effective than the modernized system, and government officials were persuaded to allow and encourage a return to the traditional system.
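The core of what the model demonstrated--that synchronized fallowing deprives pests of habitat, while staggered cropping lets them persist--can be conveyed in a toy simulation. This sketch is entirely my own; the dynamics and parameters are invented for illustration and are far simpler than Lansing and Kremer's actual model, which used historical rainfall and pest data.

```python
# Toy model of synchronized vs. staggered cropping (invented dynamics, far
# simpler than Lansing and Kremer's model).  Pests persist on a subak only
# while it or an adjacent subak has a crop in the field; a block of subaks
# fallowing together wipes them out.

def pest_pressure(schedules, seasons=21):
    """Total pest damage over `seasons`; schedules[i] is subak i's cropping cycle."""
    n = len(schedules)
    pests = [1.0] * n
    total_damage = 0.0
    for t in range(seasons):
        for i in range(n):
            cropped = schedules[i][t % len(schedules[i])]
            neighbor_cropped = any(
                schedules[j][t % len(schedules[j])]
                for j in (i - 1, i + 1) if 0 <= j < n
            )
            if cropped or neighbor_cropped:
                pests[i] = min(4.0, pests[i] * 2)  # habitat available: pests grow
            else:
                pests[i] = 0.1  # synchronized fallow: pests lose their habitat
            if cropped:
                total_damage += pests[i]  # damage only accrues to a standing crop
    return total_damage

CYCLE = [True, True, False]  # crop two seasons, then fallow one
n = 10
synchronized = [CYCLE] * n  # every subak fallows in the same season
staggered = [CYCLE[i % 3:] + CYCLE[:i % 3] for i in range(n)]  # offset fallows
print(pest_pressure(synchronized) < pest_pressure(staggered))  # prints True
```

Synchronized fallowing lets the pest population crash every cycle; with staggered schedules, pests can always find a cropped field next door, so pressure stays high--loosely mirroring the pattern the real model revealed.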

The Balinese system of farming is an example of how local knowledge can develop and become embedded in a “premodern” society by mechanisms other than conscious and intentional scientific investigation (in this case, probably more like a form of evolution), and be invisible to the state until it is specifically studied. It’s also a case where the religious aspects of the traditional system may have contributed to its dismissal by the modern experts.

What I find of particular interest here is to what extent the local knowledge was simply embedded into the practices, and not known by any of the participants--were they just doing what they've "always" done (with practices that have evolved over the last 1,000 years), in a circumstance where the system as a whole "knows," but no individual had an understanding until Lansing and Kremer built and tested a model of what they were doing?

[A slightly different version of the above was written for my Human and Social Dimensions of Science and Technology core seminar. Thanks to Brenda T. for her comments.  More on Lansing's work in Bali may be found online here.]

Monday, April 19, 2010

Is the general public really that ignorant? Public understanding of science vs. civic epistemology

Studies of the public understanding of science generally produce results that show a disturbingly high level of ignorance.  When asked to agree or disagree with the statement that “ordinary tomatoes do not contain genes, while genetically modified tomatoes do,” only 36% of Europeans answered correctly in 2002 (and only 35% in 1999 and 1996, Eurobarometer Biotechnology Quiz).  Those in the U.S. did better with this question, with 45% getting it right; Canada and the Netherlands had the highest levels of correct answers (52% and 51%, respectively).  Tests of similar items, such as “Electrons are smaller than atoms,” “The earliest human beings lived at the same time as the dinosaurs,” and “How long does it take the Earth to go around the Sun: one day, one month, or one year?” all yield similarly low levels of correct responses.

Public understanding of science research shows individuals surveyed to be remarkably ignorant of particular facts about science, but is that the right measure of how science is understood and used by the public at large?  Such surveys ask about disconnected facts independent from a context in which they might be used, and measure only an individual’s personal knowledge. If, instead, those surveyed were asked who among their friends would they rely upon to obtain the answer to such a question, or how would they go about finding a reliable answer to the question, the results might prove to be quite different.

Context can be quite important. In the Wason selection task, individuals are shown four cards labeled, respectively, “E”, “K,” “4,” and “7,” and are asked which cards they would need to turn over in order to test the rule, “If a card has a vowel on one side, then it has an even number on the other side.” Test subjects do very well at recognizing that the “E” card needs to be turned over (corresponding to the logical rule of modus ponens), but very poorly at recognizing that the “7,” rather than the “4,” needs to be turned over to find out if the rule holds (i.e., they engage in the fallacy of affirming the consequent rather than use the logical rule of modus tollens). But if, instead of letters and numbers, a scenario with more context is constructed, subjects perform much more reliably. In one variant, subjects were told to imagine that they are post office workers sorting letters, and looking to find those which do not comply with a regulation that requires an additional 10 lire of postage on sealed envelopes. They are then presented with four envelopes (two face down, one opened and one sealed, and two face up, one with a 50-lire stamp and one with a 40-lire stamp) and asked to test the rule “If a letter is sealed, then it has a 50-lire stamp on it.” Subjects then recognize that they need to turn over the sealed face-down envelope and the 40-lire stamped envelope, even though the problem is logically equivalent to the original selection task on which they perform poorly.
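The logic of the abstract task can be spelled out in a short sketch (a toy illustration of my own, not from the sources cited): only a card that could falsify the rule needs to be turned over.

```python
# Toy illustration of the Wason selection task.  The rule "if vowel on one
# side, then even number on the other" can be falsified only by a card
# showing a vowel (the hidden number might be odd) or a card showing an odd
# number (the hidden letter might be a vowel).

def must_turn(visible_face: str) -> bool:
    """True if the card could falsify the rule and so must be checked."""
    if visible_face.isalpha():
        # modus ponens: a visible vowel forces a check of the hidden number
        return visible_face.lower() in "aeiou"
    # modus tollens: a visible odd number forces a check of the hidden letter
    return int(visible_face) % 2 == 1

cards = ["E", "K", "4", "7"]
print([c for c in cards if must_turn(c)])  # prints ['E', '7']
```

The "4" drops out because nothing on its hidden side can violate the rule--which is exactly the inference subjects miss in the abstract version.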

Sheila Jasanoff, in Designs on Nature, argues that measures of the public understanding of science are not particularly relevant to how democracies actually use science. Instead, she devotes chapter 10 of her book to an alternative approach, “civic epistemology,” which is a qualitative framework for understanding the methods and practices of a community’s generation and use of knowledge.  She offers six dimensions of civic epistemologies:
(1) the dominant participatory styles of public knowledge-making; (2) the methods of ensuring accountability; (3) the practices of public demonstration; (4) the preferred registers of objectivity; (5) the accepted bases of expertise; and (6) the visibility of expert bodies.  (p. 259)
She offers the following table of comparison on these six dimensions for the U.S., Britain, and Germany:

United States (Contentious) | Britain (Communitarian) | Germany (Consensus-seeking)
1. Pluralist, interest-based | Embodied, service-based | Corporatist, institution-based
2. Assumptions of distrust; Legal | Assumptions of trust; Relational | Assumption of trust; Role-based
3. Sociotechnical experiments | Empirical science | Expert rationality
4. Formal, numerical, reasoned | Consultative, negotiated | Negotiated, reasoned
5. Professional skills | Experience | Training, skills, experience
6. Transparent | Variable | Nontransparent

She argues that this multi-dimensional approach provides a meaningful way of evaluating the courses of scientific policy disputes regarding biotech that she describes in the prior chapters of the book, while simply looking at national data on public understanding of science with regard to those controversies offers little explanation.  The nature of those controversies didn’t involve just disconnected facts, or simple misunderstandings of science, but also involved interests and values expressed through various kinds of political participation.

Public understanding of science surveys do provide an indicator of what individuals know that may be relevant to public policy on education, but it is at best a very indirect and incomplete measure of what is generally accepted in a population, and even less informative about how institutional structures and processes use scientific information.  The social structures in modern democracies are responsive to other values beyond the epistemic, and may in some cases amplify rational or radical ignorance of a population, but they may more frequently moderate and mitigate such ignorance.

Sources:
  • Eurobarometer Biotechnology Quiz results from Jasanoff, Designs on Nature, 2005, Princeton University Press, p. 87.
  • U.S., Canada, Netherlands survey results from Thomas J. Hoban slide in Gary Marchant’s “Law, Science, and Technology” class lecture on public participation in science (Nov. 16, 2009).
  • Wason task description from John R. Anderson, Cognitive Psychology and Its Implications, Second Edition, 1985, W.H. Freeman and Company, pp. 268-269.
[A slightly different version of the above was written as a comment for my Human and Social Dimensions of Science and Technology core seminar. Thanks to Brenda T. for her comments.]

Thursday, April 15, 2010

Winner's techne and politeia, 22 years later

Chapter 3 of Langdon Winner’s The Whale and the Reactor (1988) is titled “Techné and Politeia,” a discussion of the relationship of technology and politics that draws upon Plato, Rousseau, and Thomas Jefferson to recount historical views before turning to the “modern technical constitution.”  The contemporary “interconnected systems of manufacturing, communications, transportation” and so forth that have arisen have a set of five features that Winner says “embody answers to age-old political questions ... about membership, power, authority, order, freedom, and justice” (p. 47).

The five features are (pp. 47-48):
  1. “the ability of technologies of transportation and communication to facilitate control over events from a single center or small number of centers.”
  2. “a tendency for new devices and techniques to increase the most efficient or effective size of organized human associations.”
  3. “the way in which the rational arrangement of socio-technical systems has tended to produce its own distinctive forms of hierarchical authority.”
  4. “the tendency of large, centralized, hierarchically arranged sociotechnical entities to crowd out and eliminate other varieties of human activity.”
  5. “the various ways that large sociotechnical organizations exercise power to control the social and political influences that ostensibly control them.” (e.g., regulatory capture)
Winner states that the adoption of systems with these features implicitly provides answers to political questions without our thinking about it, questions such as “Should power be centralized or dispersed? What is the best size for units of social organization? What constitutes justifiable authority in human associations? Does a free society depend on social uniformity or diversity? What are appropriate structures and processes of public deliberation and decision making?” (p. 49)  Where the founding fathers of the United States considered these questions explicitly in formulating our political constitution, the developers of technological systems--which have become socio-technical systems, with social practices surrounding the use of technology--have typically failed to do so, being more concerned with innovation, profit, and organizational control rather than broader social implications (p. 50).

While there are widely accepted criteria for placing regulatory limits on technology--Winner notes five (threats to health and safety, exhaustion of a vital resource, degrading environmental quality, threats to natural species and wilderness, and causing “social stresses and strains of an exaggerated kind,” pp. 50-51)--he suggests that these are insufficient.  He cites a study by colleagues of electronic funds transfer (EFT) which suggested that it “would make possible a shift of power from smaller banks to larger national and international institutions” and create problems of data protection and individual privacy.  But those problems don’t seem to fall under his five criteria, so he suggested, ironically, that “their research try to show that under conditions of heavy, continued exposure, EFT causes cancer in laboratory animals” (p. 51).  Although I’d be surprised to find that EFT by itself had the effect Winner suggests, the recent global financial crisis has shown the problems of allowing financial institutions to become “too big to fail” and motivated financial reform proposals (e.g., Sen. Dodd’s bill that would create new regulatory power over institutions with more than $50 billion in assets, including the ability to force such institutions into liquidation--“death panels” for large financial institutions).

In the 22 years since Winner’s book was published, most of his five features seem to continue to be relevant to developments such as the Internet.  With respect to (2), (3), and (4), the Internet has greatly reduced the costs of organizing and allowed for social (non-market) production of goods.  But the mechanisms which ease the creation of small, geographically dispersed groups have also facilitated the creation of larger groups, new kinds of hierarchical authority, and new kinds of centralization and monitoring (e.g., via applications used by hundreds of millions of people, provided by companies like Google, Facebook, and Twitter).  It’s also allowed for new forms of influence by the same old powers-that-be, via techniques like astroturfing and faux amateur viral videos.

[A slightly different version of the above was written as a comment for my Human and Social Dimensions of Science and Technology core seminar. Thanks to Tim K. for his comments (though I declined to move the paragraph you suggested).]

Wednesday, April 07, 2010

Many Species of Animal Law

Today I went to hear Bruce Wagman speak on the subject of "Many Species of Animal Law" at ASU's Sandra Day O'Connor College of Law.  Wagman, an attorney with Schiff Hardin who is also an outside litigator for the Animal Legal Defense Fund, has litigated cases involving animals for 18 years, written a case book on animal law, and teaches animal law courses at several law schools as an adjunct faculty member.  He was introduced by ASU Law School Dean Paul Berman and Arizona Court of Appeals Judge Pat Norris.

Wagman began by defining "animal law" as any law in which the status of an animal--psychological, biological, welfare, and so on--matters, as opposed to its value as property.  He suggested that animal law attorneys "may be the only lawyers on earth whose clients are all innocent."

He divided his talk up into multiple "species" of animal law.

Species 1: Companion Animal Issues

He said this makes up the majority of his cases, and includes injuries by or to animals, including veterinary malpractice.  The challenge is to get courts to recognize that animals are not merely property, since historically companion animals have been viewed as property with low or even zero market value.  In cases where an animal is injured or killed, market value doesn't recognize the interests of the animal or the other kinds of value that companion animals provide.  Under the American Law Institute's Restatements of the Law, however, there is a notion of "special property" (or "peculiar property" in California's statutes) which allows quantification of other kinds of worth of an animal to its owner, for instance if the animal is a therapy dog.  Emotional distress damages, however, are not available.

Other sorts of companion animal cases include custody disputes, which often occur as a result of one partner just trying to inflict distress on another rather than having actual interest in the animal.  Wagman said that courts are beginning to take a better look at the interests of the animal in such cases, and be willing to appoint a guardian ad litem, as occurred in the Michael Vick case and in another case in Tennessee where there was a dispute over custody of a dog between a dead man's girlfriend and parents.

There are dangerous dog issues, where an attorney may be fighting against the classification of a dog as a dangerous or vicious animal, or against its euthanasia--what he called "capital cases" for animals.  In three counties surrounding San Francisco, what happens when a dog bites another dog badly enough to require stitches varies dramatically.  In one county, the dog gets a period of probation.  In another, the dog gets labeled as a dangerous or vicious dog, which requires the owner to meet various conditions: housing the dog in a particular way, having a fence of a certain height, carrying additional insurance, and so forth.  And in Santa Clara County, the dog gets euthanized.  He pointed out that that county's statute has an exemption for "mitigating circumstances," which he's successfully used to prevent dogs from being euthanized.

Finally, there are wills and trusts--he said he doesn't do that sort of work, but that 48 states now have mechanisms for having trusts for animals.

He said he considers companion animals to be a sort of "gateway animal" for getting recognition of animals in the law, and noted that we tend to be "speciesists" who would feel very different about snakes vs. Labrador Retrievers.  [IMO, this is rational to the extent that animals differ in cognitive capacities, and I note that at no point did he discuss litigating on behalf of cockroaches against pest control companies.]

Species 2: Farm animal issues--legislation and litigation.

His second species of animal law was about animals killed for food--about 10 billion per year in the United States.  He said the goal here is not to stop the killing, but just to improve the living conditions of animals before they're killed for food.  This is problematic, however, because the animal cruelty statutes are criminal rather than civil (with an exception in North Carolina that will be discussed with regard to Species 3 of animal law), and because the criminal law for animal cruelty excludes farm animals in 35 states.  He discussed a few of the more abusive methods of animal treatment in factory farming--calf crates, in which calves are placed for about the first 60 days of life, gestation crates for pigs (outlawed in Arizona since 2006, and also illegal in Florida, Oregon, Colorado, and California), and battery cages for chickens.

He also discussed downer animals--animals so seriously injured or ill that they are unable to move--which the meat industry wants to drag in that condition to slaughter.  Wagman raised the concern that such animals, if sick, could potentially spread illness to humans, and listed a number of diseases that might spread that way, with BSE (mad cow) at the top of the list along with avian flu.  Of these, only BSE has been documented to spread to humans, and the industry position is that there should be no restrictions on downer pigs unless and until a human actually gets sick.  The state of California passed a law requiring that all downer animals be euthanized on the spot; the meat industry sued and overturned the statute in federal district court, but the 9th Circuit just reversed that decision last week (National Meat Association v. Brown).

Species 3: Animal hoarding--private ownership, breeders, and the sanctuary that is not

Wagman said that there have been 250,000 documented cases of animal hoarding, and that they are difficult cases to work with in multiple ways.  He said he believes such cases involve mental illness, but while the APA has a diagnosis for "hoarding" behavior, it excludes animal hoarding, which is considered to be different.  How many animals constitutes hoarding?  He said he likes to say "more than eight," because he has eight animals at home.  Characteristics of hoarders include possessing more animals than they can care for, having a sense of being persecuted, and living in deplorable conditions.

He discussed two cases that he litigated, ALDF v. Barbara & Robert Woodley and ALDF v. Janie Conyers, which involved over 500 animals between them.  The former case, in North Carolina, was able to use North Carolina statute 19a, which allows a civil cause of action for animal cruelty.  Wagman had some horrifying photos from the Woodley case.  They had hundreds of dogs in their home living in their own feces, where ammonia levels were 20 times the USDA maximum allowed in a pig facility.  These ammonia levels caused blindness in the dogs, as well as chemical burns to bare skin that contacted the floor, such as dogs' scrotums.  Multiple dogs were kept in wooden boxes with lids on them, and never let out.  Mrs. Woodley's favorite dog, Buddy, not only had his eyes burned to blindness from ammonia; the bones in his jaw also deteriorated from malnutrition.  Local officials had known of the Woodleys' problem for 20 years, but considered themselves powerless to do anything about it, since the scale of the problem was so large--the local shelter had only eight kennels, while the Woodleys had about 450 animals.  The ALDF had to coordinate a massive effort to manage the rescue of the animals through its case.

Conyers was an AKC poodle breeder who had 106 poodles living in their own feces.

Wagman said that animal psychological suffering is difficult to show, but it can be done; demonstrating physiological suffering is easier, with objective criteria like the ammonia levels and physical injuries to animals.

There is no law against hoarding (except in Hawaii), just the criminal abuse statutes (and civil in NC).  In the hoarding cases the abuse is typically neglect rather than active abuse.

Species 4: Exotic animal ownership

Wagman has handled about 10 chimpanzee cases.  One involved a couple in West Covina, California, whose chimp Moe, whom they had kept for 35 years, had bitten two people.  He argued for a guardian ad litem to determine what was in the best interests of the chimp, and arranged for Jane Goodall and Roger Fouts to take that role.  The court looked upon the proposal favorably, but the parties came to an out-of-court settlement.

He also briefly discussed the Stamford, Connecticut case of Travis, the 200-pound chimpanzee that was in the news last year after attacking a woman.

He argued that there should be a legislative fix to ban exotic animal ownership completely--they're wild animals.  [A complete ban seems to me too much--there should be exceptions for research, conservation, breeding programs for endangered species, and so forth.  And shouldn't it be possible to domesticate other wild animals?]  Connecticut has taken the step of banning chimp ownership.

Species 5: Shelter practices - euthanasia, veterinary care, adequate food, water, and sanitation, and hold periods

Animal shelters have an overwhelming job, said Wagman.  The County of Los Angeles, which he sued, operates seven shelters which handle tens of thousands of animals per year.  California law says that all animals must get veterinary care and be held for five days, and allowing animal suffering without treatment is not permissible.  The shelters' own records showed that they weren't meeting that standard for thousands of animals, but they're now working to meet it and having their activity monitored for compliance.  A similar set of cases occurred in Kentucky, when the state transferred all shelter responsibility to the counties.  Although the standards of care were minimal, the counties weren't meeting them, and there were nutrition, veterinary care, and euthanasia issues.  Upon getting notice, they quickly took action to remedy the problems.

In Georgia, by contrast, there is a statute that prohibits the use of gas chambers for euthanization at shelters, but the Commissioner of Agriculture sent out letters to the shelters asking that they purchase gas chambers for euthanization.  Gas chambers apparently produce very ugly results in some cases, such as with unhealthy dogs.  A lawsuit against the state of Georgia for its failure to comply with its own statute resulted in an injunction, which the state immediately violated by sending out more letters asking for gas chamber purchases.  Only after the plaintiffs obtained a contempt ruling from the court did the state finally comply.

Species 6: Entertainment

Wagman called this category both the most obvious and the most hidden.  The use of animals in entertainment is obvious, but what is not obvious is what goes on behind the scenes, the knowledge of which drains the fun out of the entertainment.

Circuses, zoos, film and TV ads, animal fighting, public appearances, racing and rodeos, and hunting and fishing are all cases of animals used for entertainment.  Wagman first discussed elephants in circuses, commenting on a recent Ringling Brothers case which was tossed out on an issue of standing.  The case involved the use of bullhooks for elephant training, which injures the animals.  The defense didn't deny use of bullhooks, but claimed that they only use them as "guides."

Elephant treatment in zoos is also problematic, since standing around on hard surfaces causes painful arthritis.  In the wild, elephants are awake 21 hours a day and may move 35 miles per day.

Wagman discussed dog fighting, and said that the Michael Vick case was a wakeup call for America to the reality of dog fighting, which exists in every state and most major cities.

He argued that the use of great apes in film and television should be banned, because of how the training process works.  He said that while trainers claim to use only positive reinforcement training, an undercover person who volunteered for a year and a half with trainer Sid Yost found otherwise.  Young chimpanzees are immediately beaten and punched to get them to comply.  Their performance lifetime is about 3-5 years, after which they become too strong to control, and end up in private homes, in research, or in zoos, often all alone in barren cages.  Wagman pointed out that the commonly used image of a "smiling" chimpanzee is actually a fear grimace.  He does lots of work for sanctuaries, of which there are nine in the U.S. for chimpanzees (including chimpsanctuarynw.org).

Regarding hunting, he distinguished traditional hunting from canned hunting and Internet hunting.  Hunting is protected in most states, including in many state constitutions.  Canned hunting, in which animals fed by hand by humans are flushed out into open areas on ranches to be shot, is not considered hunting by most traditional hunters.  [But it is considered hunting by our former Vice President, Dick Cheney.]  Internet hunting, where a rifle can be fired at live animals over the Internet, has been banned in 30 states.

He mentioned mountain lion hunting in the Black Hills of South Dakota, where mountain lions have become fairly scarce.  A lawsuit was filed to try to stop the hunting on grounds of the animals' near-extinction, but the injunction was denied on the grounds that any mountain lions were unlikely even to be found and killed.  Two mountain lions were nonetheless killed shortly thereafter, and even though there was a law that prohibited killing female mountain lions with cubs, the second one killed had a cub, and there was no prosecution.

Some Adidas shoes are made with kangaroo skin, and the state of California banned the importation of kangaroo skin--a ban Adidas ignored.  Adidas was sued as a result and lost at the California Supreme Court, but it responded by persuading the legislature to repeal the ban rather than change its practices.

Species 7: Species and breed-specific legislation and ADA breedism case.

A variety of dog breeds have been considered at various times and places to be "bad dogs" that create a special danger.  After WWII, it was German Shepherds and Dobermans.  All challenges to such breed-specific legislation have failed, because the "rational relation" standard can be satisfied by even a single case of harm.  A case in progress right now in Concord, California involves Theresa Huerta, a woman suing under the Americans with Disabilities Act to keep her pit bull therapy dog from being euthanized.

Wagman concluded by saying that his overall objective is to keep the public and the courts focused on the real issue, which is ending blatant cases of animal abuse.  Animal law is a growing field, and there's an annual animal law conference in Portland that's now in its fifth year.

Tuesday, April 06, 2010

First two stray dogs of 2010

I caught these two male dogs in the front yard this afternoon--they wandered in while the gate was open, and I closed it to catch them.  No collars, no tags, and the pit mix was unneutered (didn't check the Spitz mix or whatever he is).  At first they were very skittish, but after they finally approached me, both wanted my constant attention.  They were both quickly picked up by the Maricopa County pound--I'm sure they'll get taken to the east side.

As I was closing the gate to catch these guys, I heard a car honk its horn and a dog yelp, and looked up to see the car drive away as a man, woman, and dog stood on the sidewalk, the dog limping.  I asked the man if the dog had just been hit, and if it was his dog, and he answered yes to both.  They walked off, the dog limping (and off leash, with no collar or tags).

Please, if you own animals, be a responsible pet owner.

Against "coloring book" history of science

It's a bad misconception about evolution that it proceeds in a linear progression of one successfully evolving species after another displacing its immediate ancestors.  Such a conception of human history is equally mistaken, and is often criticized with terms such as "Whiggish history" or "determinism" with a variety of adjectives (technological, social, cultural, historical).  That includes the history of science, where the first version we often hear is one that has been rationally reconstructed by looking back at the successes and putting them into a linear narrative.  Oh, there are usually a few errors thrown in, but they're usually fit into the linear narrative as challenges that are overcome by the improvement of theories.

The reality is a lot messier, and getting into the details makes it clear that not only is a Whiggish history of science mistaken, but that science doesn't proceed through the algorithmic application of "the scientific method"--in fact, there is no such thing as "the scientific method."  Rather, there is a diverse set of methods that are themselves evolving in various ways, and sometimes not only do methods fully endorsed as rational and scientific produce erroneous results, but methods with no such endorsement, even demonstrably irrational ones, fortuitously produce correct results.  For example, Johannes Kepler was a neo-Pythagorean number mystic who correctly produced his second law of planetary motion by taking an incorrect version of the law based on his intuitions and deriving the correct version from it by way of a mathematical argument that contained an error.  Although he fortuitously got the right answer and receives credit for devising it, he was not justified in believing it to be true on the basis of his erroneous proof.  With his first law, by contrast, he followed an almost perfectly textbook version of the hypothetico-deductive model of scientific method, formulating hypotheses and testing them against Tycho Brahe's data.

The history of the scientific revolution includes numerous instances of new developments occurring piecemeal, with many prior erroneous notions being retained.  Copernicus retained not only perfectly circular orbits and celestial spheres, but still needed to add epicycles to get his theory anywhere close to the predictive accuracy of the Ptolemaic models in use.  Galileo retained perfect circles and insisted that circular motion was natural motion, refusing to consider Kepler's elliptical orbits.  There seems to be a good case for "path dependence" in science.  Even the most revolutionary changes actually build on bits and pieces that have come before--and sometimes rediscover work that had already been done, like Galileo's derivation of the uniform acceleration of falling bodies, which had already been worked out by Nicole Oresme and the Oxford calculators.  And the social and cultural environment--not just the scientific history--has an effect on what kinds of hypotheses are considered and accepted.

This conservatism of scientific change is a double-edged sword.  On the one hand, it suggests that claims purporting to radically overthrow existing theory (that "everything we know is wrong") are unlikely to succeed--even if they happen to be correct.  And given that there are many more ways to go wrong than to go right, such radical revisions are very likely not to be correct.  Even where new theories are correct in some of their more radical claims (e.g., Copernicus' heliocentric model, or Wegener's continental drift), it often requires other pieces to fall into place before they become accepted (and before it becomes rational to accept them).  On the other hand, this also means that we're likely to be blinded to new possibilities by what we already accept that seems to work well enough, even though it may be an inaccurate description of the world that is merely predictively successful.  "Consensus science" at any given time probably includes lots of claims that aren't true.

My inference from this is that we need both visionaries and skeptics, and a division of cognitive labor that's largely conservative, but with tolerance for diversity and a few radicals generating the crazy hypotheses that may turn out to be true.  The critique of evidence-based medicine made by Kimball Atwood and Steven Novella--that it fails to consider the prior plausibility of the hypotheses to be tested--is a good one that recognizes how unlikely radical hypotheses are to be correct, and thus that huge amounts of money shouldn't be spent to generate and test them.  (Their point is actually stronger than that, since most of the "radical hypotheses" in question are not really radical or novel, but are based on already discredited views of how the world works.)  But that critique shouldn't be taken to exclude anyone from generating and testing hypotheses that don't appear to have a plausible mechanism, because there is ample precedent for new phenomena being discovered before the mechanisms that explain them.

I think there's a tendency among skeptics to talk about science as though it's a unified discipline, with a singular methodology, that makes continuous progress, and where the consensus at any moment is the most appropriate thing to believe.  The history of science suggests, on the other hand, that it's composed of multiple disciplines, with multiple methods, that it proceeds in fits and starts, has dead-ends, sometimes rediscovers correct-but-ignored past discoveries, and is both fallible and influenced by cultural context.  At any given time, some theories are not only well-established but unified well with others across disciplines, while others don't fit comfortably with the rest, or may be idealized models that have predictive efficacy but seem unlikely to be accurate descriptions of reality in their details.  To insist on an overly rationalistic and ahistorical model is not just out-of-date history and philosophy of science, it's a "coloring book" oversimplification.  While that may be useful for introducing ideas about science to children, it's not something we should continue to hold to as adults.