Thursday, April 29, 2010

Science fiction scenarios and public engagement with science

Science fiction has been a popular genre at least since Jules Verne’s 19th-century work, and arguably longer still. But can it have practical value as well as being a form of escapist entertainment? Clark Miller and Ira Bennett of ASU suggest that it has potential for improving the capacity of the general public “to imagine and reason critically about technological futures” and for being integrated into technology assessment processes (“Thinking longer term about technology: is there value in science fiction-inspired approaches to constructing futures?" Science and Public Policy 35(8), October 2008, pp. 597-606).

Miller and Bennett argue that science fiction can provide a way to stimulate people to wake from “technological somnambulism” (Langdon Winner’s term for taking for granted, or being oblivious to, sociotechnical changes): to recognize such changes, to realize that there are alternative possibilities and that particular changes are not predetermined, and to engage with the deliberative processes and institutions that choose directions of change. Where most political planning is short-term and based on projections that simply extend current trends incrementally into the future, science fiction provides scenarios that exhibit “non-linearity” by involving multiple, major, and complex changes from current reality. While these scenarios “likely provide...little technical accuracy” about how technology and society will actually interact, they may still provide ideas about alternative possibilities, and in particular “clear visions of desirable--and not so desirable--futures.”

The article begins with a quote from Christine Peterson of the Foresight Institute recommending that “hard science fiction” be used to aid in constructing “long-term” (20+ year) prediction scenarios; she advises, “Don’t think of it as literature,” and recommends focusing on the technologies rather than the people. Miller and Bennett, however, argue otherwise--not only is science fiction useful for thinking about longer-term consequences, but the parts about the people--how technologies actually fit into society--are just as important as, if not more important than, the ideas about the technologies themselves.

The article ends with examples of the use of science fiction in workshops for nanotechnology researchers conducted by Bennett, along with suggested uses in science education and in “society’s practices and institutions for public engagement and technology assessment.” About the former suggestion, the authors write that “The National Science Foundation, which has by and large not been in the business of supporting science fiction, might be encouraged to fund training and/or networking exercises that would foster greater interaction among scientists and fiction writers.”

While some steps have been taken to promote interaction between scientists and fiction writers--most notably the National Academy of Sciences’ Science and Entertainment Exchange project, headed by executive director Jennifer Ouellette, who spoke at last year’s Amazing Meeting 7--this interaction is mostly one-way. The project is conceived of as a way for science to be accurately communicated to the general public through entertainment, rather than as a way to generate ideas for technological innovation and scientific development from the general public or from the entertainment works that are created. The SEE promotes collaboration between scientists and entertainment producers on creative works of entertainment, but not necessarily directing creative feedback into science or building new capacities in science and technology, except indirectly by providing the general public with inspiration about science. Similarly, the Skeptrack and Science Track at the annual Dragon*Con science fiction convention in Atlanta provide ways for scientists and skeptics to interact with science fiction fans (and creators of science fiction works), but the communication is primarily in one direction, via speakers and panels with an opportunity for Q&A. (Unlike the notion of a SkeptiCamp, where all participants are potentially on an equal basis, with everyone given the opportunity to be a presenter.)

[P.S. The Long Now Foundation is an organization that makes the Foresight Institute’s time horizon look short--their time frame is the next 10,000 years, with a focus on how to make extremely long-term projects work and how to create an institutional framework that can persist for extremely long periods of time. (The obligatory science fiction references are Walter M. Miller, Jr.’s A Canticle for Leibowitz and Neal Stephenson’s Anathem.)]

[A slightly different version of the above was written for my Human and Social Dimensions of Science and Technology core seminar. Thanks to Judd A. for his comments--he raised the concern that SkeptiCamp is connected to a rationalist form of skepticism that is concerned to "narrow the range of 'acceptable' beliefs" rather than widen it. While this may be true, depending on what the class of "acceptable" beliefs is prior to applying a skeptical filter, it need not be--applying scientific methodology and critical thinking can also open up possibilities for individuals. And if the initial set of beliefs includes all possibilities, converting that set to knowledge must necessarily involve narrowing rather than expanding the range, as there are many more ways to go wrong than to go right. But this criticism points out something that I've observed in my comparison of skepticism to Forteanism--skepticism is more concerned about avoiding Type I errors (accepting false claims) than Type II errors (rejecting true ones), while Forteans have the reverse priority, and these are complementary positions that both need representation in society.]
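To make that trade-off concrete, here is a toy simulation--my own illustrative construction, not anything from the seminar or the sources discussed above. Suppose candidate claims are mostly false, each comes with a noisy evidence score, and a claim is accepted when its score clears a threshold. A strict "skeptical" threshold minimizes Type I errors at the cost of Type II errors; a permissive "Fortean" threshold does the reverse:

    # Hypothetical toy model (illustrative only): the Type I / Type II
    # trade-off as an evidence-acceptance threshold varies.
    import random

    random.seed(0)
    # Most candidate claims are false: "more ways to go wrong than to go right."
    truths = [random.random() < 0.1 for _ in range(10000)]
    # Noisy evidence scores: true claims score higher on average.
    scores = [(1.0 if t else 0.0) + random.gauss(0, 1) for t in truths]

    for label, threshold in [("skeptical (strict)", 1.5), ("fortean (permissive)", -0.5)]:
        type1 = sum(1 for t, s in zip(truths, scores) if not t and s >= threshold)  # false claims accepted
        type2 = sum(1 for t, s in zip(truths, scores) if t and s < threshold)       # true claims missed
        print(f"{label:22s} Type I: {type1:5d}   Type II: {type2:5d}")

Neither threshold eliminates both kinds of error at once, which is the sense in which the two stances are complementary.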

Thursday, April 22, 2010

Haven't we already been nonmodern?

Being modern, argues Bruno Latour in We Have Never Been Modern (1993, Harvard Univ. Press), involves drawing a sharp distinction between “nature” and “culture,” through a process of “purification” that separates everything into one or the other of these categories. It also involves breaking with the past: “Modernization consists in continually exiting from an obscure age that mingled the needs of society with scientific truth, in order to enter into a new age that will finally distinguish clearly what belongs to atemporal nature and what comes from humans, what depends on things and what belongs to signs” (p. 71).

But hold on a moment--who actually advocates that kind of sharp division between nature and culture, without acknowledging that human beings and their cultures are themselves a part of the natural order of things? As the 1987 Love and Rockets song “No New Tale to Tell” put it: “You cannot go against nature / because when you do / go against nature / it’s part of nature, too.” Trying to divide the contents of the universe into a sharp dichotomy often yields a fuzzy edge, if not outright paradox. While Latour is right to object to such a sharp distinction (or separation) and to argue for a recognition that much of the world consists of “hybrids” that include natural and cultural aspects (true of both material objects and ideas), I’m not convinced that he’s correctly diagnosed a genuine malady when he writes that “Moderns ... refuse to conceptualize quasi-objects as such. In their eyes, hybrids present the horror that must be avoided at all costs by a ceaseless, even maniacal purification” (p. 112).

Latour writes that anthropologists do not study modern cultures in the manner that they study premodern cultures. For premoderns, an ethnographer will generate “a single narrative that weaves together the way people regard the heavens and their ancestors, the way they build houses and the way they grow yams or manioc or rice, the way they construct their government and their cosmology,” but this is not done for modern societies because “our fabric is no longer seamless” (p. 7). True, but the real problem for such ethnography is not that we lack a unified picture of the world (though we do lack one) but that we have massive complexity and specialization--a complexity that Latour implicitly recognizes (pp. 100-101) but doesn’t draw out as a reason.

The argument that Latour makes in the book builds upon this initial division of nature and culture by the process of “purification” with a second division between “works of purification” and “works of translation,” “translation” being a four-step process from his framework of actor-network theory, which he actually doesn’t discuss much in this book. He proposes that the “modern constitution” contains “works of translation”--networks of hybrid quasi-objects--as a hidden and unrecognized layer that needs to be made explicit in order to be “nonmodern” (p. 138) or “amodern” (p. 90) and avoid the paradoxes of modernity (or other problems of anti-modernity, pre-modernity, and post-modernity).

His attempt to draw the big picture is interesting and often frustrating, as when he makes unargued-for claims that appear to be false, e.g., “as concepts, ‘local’ and ‘global’ work well for surfaces and geometry, but very badly for networks and topology” (p. 119); “the West may believe that universal gravitation is universal even in the absence of any instrument, any calculation, any decoding, any laboratory ... but these are respectable beliefs that comparative anthropology is no longer obliged to share” (p. 120; also p. 24); speaking of “time” being reversible where he apparently means “change” or perhaps “progress” (p. 73); and his putting “universality” and “rationality” on a list of values of moderns to be rejected (p. 135). I’m not sure how it makes sense to deny the possibility of universal generalizations while putting forth a proposed framework for the understanding of everything.

My favorite parts of the book were his recounting of Steven Shapin and Simon Schaffer’s Leviathan and the Air Pump (pp. 15-29) and his critique of that project, and his summary of objections to postmodernism (p. 90). Latour is correct, I think, in his critique that those who try to explain the results of science solely in terms of social factors are making a mistake that privileges “social” over “natural” in the same way that attempting to explain them without any regard to social factors privileges “natural” over “social.” He writes to the postmodernists (p. 90):

“Are you not fed up at finding yourselves forever locked into language alone, or imprisoned in social representations alone, as so many social scientists would like you to be? We want to gain access to things themselves, not only their phenomena. The real is not remote; rather, it is accessible in all the objects mobilized throughout the world. Doesn’t external reality abound right here among us?”

In a commentary on this post, Gretchen G. observed that we do regularly engage in the process of "purification" about our concepts and attitudes towards propositions in order to make day-to-day decisions--and I think she's right. We do regard things as scientific or not scientific, plausible or not plausible, true or false, even while we recognize that there may be fuzzy edges and indeterminate cases. And we tend not to like the fuzzy cases, and want to put them into one category or the other. In some cases, this may be merely an epistemological problem of our human (and Humean) predicament where there is a fact of the matter; in others, our very categories may themselves be fuzzy, failing to fit reality or "carve nature at its joints."

[A slightly different version of the above was written for my Human and Social Dimensions of Science and Technology core seminar. Thanks to Gretchen G. for her comments.  An entertaining critique of Latour's earlier book Science in Action is Olga Amsterdamska's "Surely You're Joking, Monsieur Latour!", Science, Technology, and Human Values vol. 15, no. 4 (1990): 495-504.]

Wednesday, April 21, 2010

Matthew LaClair vs. Texas Board of Education

Matthew LaClair, who exposed his proselytizing U.S. history teacher/youth pastor in 2006, now hosts his own radio show, "Equal Time for Freethought," on WBAI 99.5 FM on Sundays at 6:30 p.m. ET in the New York/New Jersey/Connecticut area.  The show is also online via streaming audio.

This coming Sunday, April 25, Matthew will be debating a conservative member of the Texas Board of Education about their recent changes to the curriculum (e.g., removing Thomas Jefferson).

If you happen to miss the show, it will subsequently be available in the online archives.

Tuesday, April 20, 2010

Translating local knowledge into state-legible science

James Scott’s Seeing Like a State (about which I've blogged previously) talks about how the state imposes standards in order to make features legible, countable, regulatable, and taxable. J. Stephen Lansing’s Perfect Order: Recognizing Complexity in Bali describes a case where the reverse happened. When a top-down system of scientifically designed order--a centralized water-management scheme--was imposed on Balinese rice farmers in the name of modernization in the early 1970s, the result was a brief increase in productivity followed by disaster. Rather than leading to more efficient use of water and continued improved crop yields, it produced pest outbreaks that destroyed crops. An investment of $55 million in Romijn gates to control water flow in irrigation canals had the opposite of the intended effect. Farmers removed the gates or lifted them out of the water and left them to rust, upsetting the consultants and officials behind the project. Pesticides delivered to farmers bred pesticide-resistant brown leafhoppers, and supplied fertilizers washed into the rivers and killed coral reefs at the river mouths.

Lansing was part of a team sponsored by the National Science Foundation in 1983 that evaluated the Balinese farmers’ traditional water management system to understand how it worked. The farmers of each village belong to subaks, or organizations that manage rice terraces and irrigation systems, which are referred to in Balinese writings going back at least a thousand years. Lansing notes that “Between them, the village and subak assemblies govern most aspects of a farmer’s social, economic, and spiritual life.”

Lansing’s team found that the Balinese system of water temples, religious ritual, and irrigation managed by the subaks would synchronize fallow periods of contiguous segments of terraces, so that long segments could be kept flooded after harvest, killing pests by depriving them of habitat. But their attempt and that of the farmers to persuade the government to allow the traditional system to continue fell upon deaf ears, and the modernization scheme continued to be pushed.

In 1987, Lansing worked with James Kremer to develop a computer model of the Balinese water temple system, and ran a simulation using historical rainfall data. This translation of the traditional system into scientific explanation showed that the traditional system was more effective than the modernized system, and government officials were persuaded to allow and encourage a return to the traditional system.
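Lansing and Kremer's actual model was built on real hydrology, rainfall records, and pest dynamics; purely as an illustration of the mechanism (with made-up parameters and no claim to match their model), the core idea can be sketched in a few dozen lines--pests multiply on cropped plots, die back on flooded fallow plots, and spill over to neighbors, so contiguous blocks of terraces that fallow in sync suppress pests better than uncoordinated schedules:

    # Toy sketch of the mechanism only--not Lansing and Kremer's model.
    # Subaks along a river alternate cropping and fallow seasons; pests
    # multiply on cropped plots, die back on flooded fallow plots, and
    # disperse to neighboring plots each season.
    import random

    N_SUBAKS, SEASONS = 20, 100
    GROWTH, DIEBACK, SPREAD = 1.5, 0.1, 0.3   # made-up parameters

    def total_harvest(schedule):
        pests = [1.0] * N_SUBAKS
        harvest = 0.0
        for season in range(SEASONS):
            cropped = [(season % 2) == offset for offset in schedule]
            pests = [p * (GROWTH if c else DIEBACK) for p, c in zip(pests, cropped)]
            spilled = pests[:]
            for i, p in enumerate(pests):          # dispersal to neighbors
                for j in (i - 1, i + 1):
                    if 0 <= j < N_SUBAKS:
                        spilled[j] += SPREAD * p
            pests = spilled
            # each cropped plot's yield falls as its pest load rises
            harvest += sum(1.0 / (1.0 + p) for p, c in zip(pests, cropped) if c)
        return harvest

    random.seed(1)
    in_sync = [i // (N_SUBAKS // 2) for i in range(N_SUBAKS)]    # two contiguous blocks
    scattered = [random.randint(0, 1) for _ in range(N_SUBAKS)]  # uncoordinated offsets
    print("synchronized blocks:", round(total_harvest(in_sync), 1))
    print("scattered schedules:", round(total_harvest(scattered), 1))

With these made-up numbers, the synchronized schedule yields more, because fallow blocks aren't continually reseeded with pests by cropped neighbors--the same qualitative result Lansing and Kremer used to vindicate the temple system.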

The Balinese system of farming is an example of how local knowledge can develop and become embedded in a “premodern” society by mechanisms other than conscious and intentional scientific investigation (in this case, probably more like a form of evolution), and be invisible to the state until it is specifically studied. It’s also a case where the religious aspects of the traditional system may have contributed to its dismissal by the modern experts.

What I find of particular interest here is to what extent the local knowledge was simply embedded into the practices, and not known by any of the participants--were they just doing what they've "always" done (with practices that have evolved over the last 1,000 years), in a circumstance where the system as a whole "knows," but no individual had an understanding until Lansing and Kremer built and tested a model of what they were doing?

[A slightly different version of the above was written for my Human and Social Dimensions of Science and Technology core seminar. Thanks to Brenda T. for her comments.  More on Lansing's work in Bali may be found online here.]

Monday, April 19, 2010

Is the general public really that ignorant? Public understanding of science vs. civic epistemology

Studies of the public understanding of science generally produce results that show a disturbingly high level of ignorance. When asked to agree or disagree with the statement that “ordinary tomatoes do not contain genes, while genetically modified tomatoes do,” only 36% of Europeans answered correctly in 2002 (and only 35% in 1999 and 1996, Eurobarometer Biotechnology Quiz). Those in the U.S. did better on this question, with 45% getting it right; Canada and the Netherlands had the highest levels of correct answers (52% and 51%, respectively). Similar items, such as “Electrons are smaller than atoms,” “The earliest human beings lived at the same time as the dinosaurs,” and “How long does it take the Earth to go around the Sun: one day, one month, or one year,” all yield similarly low levels of correct responses.

Public understanding of science research shows individuals surveyed to be remarkably ignorant of particular facts about science, but is that the right measure of how science is understood and used by the public at large?  Such surveys ask about disconnected facts independent from a context in which they might be used, and measure only an individual’s personal knowledge. If, instead, those surveyed were asked who among their friends would they rely upon to obtain the answer to such a question, or how would they go about finding a reliable answer to the question, the results might prove to be quite different.

Context can be quite important. In the Wason selection task, individuals are shown four cards labeled, respectively, “E,” “K,” “4,” and “7,” and are asked which cards they would need to turn over in order to test the rule, “If a card has a vowel on one side, then it has an even number on the other side.” Test subjects do very well at recognizing that the “E” card needs to be turned over (corresponding to the logical rule of modus ponens), but very poorly at recognizing that the “7,” rather than the “4,” needs to be turned over to find out whether the rule holds (i.e., they commit the fallacy of affirming the consequent rather than use the logical rule of modus tollens). But if, instead of letters and numbers, a scenario with more context is constructed, subjects perform much more reliably. In one variant, subjects were told to imagine that they are post office workers sorting letters, looking for those which do not comply with a regulation that requires an additional 10 lire of postage on sealed envelopes. They are then presented with four envelopes (two face down, one opened and one sealed, and two face up, one with a 50-lire stamp and one with a 40-lire stamp) and asked to test the rule “If a letter is sealed, then it has a 50-lire stamp on it.” Subjects then recognize that they need to turn over the sealed face-down envelope and the 40-lire stamped envelope, despite the task's logical equivalence to the original selection task on which they perform poorly.
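The logic of the abstract version is simple enough to state in a few lines of code (this is just my own illustration of the task, with the rule hard-coded): the cards worth turning are exactly those whose hidden side could falsify the rule.

    # Which cards can falsify "if vowel on one side, then even number on
    # the other"?  Only a vowel (its back might be odd) or an odd number
    # (its back might be a vowel) -- the modus ponens / modus tollens pair.
    def must_turn(visible):
        if visible.isalpha():
            return visible.upper() in "AEIOU"   # modus ponens case
        if visible.isdigit():
            return int(visible) % 2 == 1        # modus tollens case
        return False

    for card in ["E", "K", "4", "7"]:
        print(card, "->", "turn over" if must_turn(card) else "leave")
    # E and 7 must be turned over; K and 4 tell us nothing either way.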

Sheila Jasanoff, in Designs on Nature, argues that measures of the public understanding of science are not particularly relevant to how democracies actually use science. Instead, she devotes chapter 10 of her book to an alternative approach, “civic epistemology,” which is a qualitative framework for understanding the methods and practices of a community’s generation and use of knowledge.  She offers six dimensions of civic epistemologies:
(1) the dominant participatory styles of public knowledge-making; (2) the methods of ensuring accountability; (3) the practices of public demonstration; (4) the preferred registers of objectivity; (5) the accepted bases of expertise; and (6) the visibility of expert bodies.  (p. 259)
She offers the following table of comparison on these six dimensions for the U.S., Britain, and Germany:

Overall style -- United States: contentious; Britain: communitarian; Germany: consensus-seeking.
  (1) Public knowledge-making -- U.S.: pluralist, interest-based; Britain: embodied, service-based; Germany: corporatist, institution-based.
  (2) Accountability -- U.S.: assumptions of distrust, legal; Britain: assumptions of trust, relational; Germany: assumptions of trust, role-based.
  (3) Public demonstration -- U.S.: sociotechnical experiments; Britain: empirical science; Germany: expert rationality.
  (4) Registers of objectivity -- U.S.: formal, numerical, reasoned; Britain: consultative, negotiated; Germany: negotiated, reasoned.
  (5) Bases of expertise -- U.S.: professional skills; Britain: experience; Germany: training, skills, experience.
  (6) Visibility of expert bodies -- U.S.: transparent; Britain: variable; Germany: nontransparent.

She argues that this multi-dimensional approach provides a meaningful way of evaluating the courses of scientific policy disputes regarding biotech that she describes in the prior chapters of the book, while simply looking at national data on public understanding of science with regard to those controversies offers little explanation.  The nature of those controversies didn’t involve just disconnected facts, or simple misunderstandings of science, but also involved interests and values expressed through various kinds of political participation.

Public understanding of science surveys do provide an indicator of what individuals know that may be relevant to public policy on education, but it is at best a very indirect and incomplete measure of what is generally accepted in a population, and even less informative about how institutional structures and processes use scientific information.  The social structures in modern democracies are responsive to other values beyond the epistemic, and may in some cases amplify rational or radical ignorance of a population, but they may more frequently moderate and mitigate such ignorance.

Sources:
  • Eurobarometer Biotechnology Quiz results from Jasanoff, Designs on Nature, 2005, Princeton University Press, p. 87.
  • U.S., Canada, Netherlands survey results from Thomas J. Hoban slide in Gary Marchant’s “Law, Science, and Technology” class lecture on public participation in science (Nov. 16, 2009).
  • Wason task description from John R. Anderson, Cognitive Psychology and Its Implications, Second Edition, 1985, W.H. Freeman and Company, pp. 268-269.
[A slightly different version of the above was written as a comment for my Human and Social Dimensions of Science and Technology core seminar. Thanks to Brenda T. for her comments.]

Thursday, April 15, 2010

Winner's techne and politeia, 22 years later

Chapter 3 of Langdon Winner’s The Whale and the Reactor (1988) is titled “Techné and Politeia,” a discussion of the relationship of technology and politics that draws upon Plato, Rousseau, and Thomas Jefferson to recount historical views before turning to the “modern technical constitution.”  The contemporary “interconnected systems of manufacturing, communications, transportation” and so forth that have arisen have a set of five features that Winner says “embody answers to age-old political questions ... about membership, power, authority, order, freedom, and justice” (p. 47).

The five features are (pp. 47-48):
  1. “the ability of technologies of transportation and communication to facilitate control over events from a single center or small number of centers.”
  2. “a tendency for new devices and techniques to increase the most efficient or effective size of organized human associations.”
  3. “the way in which the rational arrangement of socio-technical systems has tended to produce its own distinctive forms of hierarchical authority.”
  4. “the tendency of large, centralized, hierarchically arranged sociotechnical entities to crowd out and eliminate other varieties of human activity.”
  5. “the various ways that large sociotechnical organizations exercise power to control the social and political influences that ostensibly control them.” (e.g., regulatory capture)
Winner states that the adoption of systems with these features implicitly provides answers to political questions without our thinking about it, questions such as “Should power be centralized or dispersed? What is the best size for units of social organization? What constitutes justifiable authority in human associations? Does a free society depend on social uniformity or diversity? What are appropriate structures and processes of public deliberation and decision making?” (p. 49)  Where the founding fathers of the United States considered these questions explicitly in formulating our political constitution, the developers of technological systems--which have become socio-technical systems, with social practices surrounding the use of technology--have typically failed to do so, being more concerned with innovation, profit, and organizational control than with broader social implications (p. 50).

While there are widely accepted criteria for placing regulatory limits on technology--Winner notes five (threats to health and safety, exhaustion of a vital resource, degrading environmental quality, threats to natural species and wilderness, and causing “social stresses and strains of an exaggerated kind,” pp. 50-51)--he suggests that these are insufficient.  He cites a study by colleagues of electronic funds transfer (EFT) which suggested that it “would make possible a shift of power from smaller banks to larger national and international institutions” and create problems of data protection and individual privacy.  But those problems don’t seem to fall under his five criteria, so he suggested, ironically, that “their research try to show that under conditions of heavy, continued exposure, EFT causes cancer in laboratory animals” (p. 51).  Although I’d be surprised to find that EFT by itself had the effect Winner suggests, the recent global financial crisis has shown the problems with allowing financial institutions to become “too big to fail” and has motivated financial reform proposals (e.g., Sen. Dodd’s bill that would create new regulatory power over institutions with more than $50 billion in assets, including the ability to force such institutions into liquidation--“death panels” for large financial institutions).

In the 22 years since Winner’s book was published, most of his five features seem to continue to be relevant to developments such as the Internet.  With respect to (2), (3), and (4), the Internet has greatly reduced the costs of organizing and allowed for social (non-market) production of goods.  But the mechanisms which ease the creation of small, geographically dispersed groups have also facilitated the creation of larger groups, new kinds of hierarchical authority, and new kinds of centralization and monitoring (e.g., via applications used by hundreds of millions of people, provided by companies like Google, Facebook, and Twitter).  It’s also allowed for new forms of influence by the same old powers-that-be, via techniques like astroturfing and faux amateur viral videos.

[A slightly different version of the above was written as a comment for my Human and Social Dimensions of Science and Technology core seminar. Thanks to Tim K. for his comments (though I declined to move the paragraph you suggested).]

Wednesday, April 07, 2010

Many Species of Animal Law

Today I went to hear Bruce Wagman speak on the subject of "Many Species of Animal Law" at ASU's Sandra Day O'Connor College of Law.  Wagman, an attorney with Schiff Hardin who is also an outside litigator for the Animal Legal Defense Fund, has litigated cases involving animals for 18 years, written a case book on animal law, and teaches animal law courses at several law schools as an adjunct faculty member.  He was introduced by ASU Law School Dean Paul Berman and Arizona Court of Appeals Judge Pat Norris.

Wagman began by defining "animal law" as any law where the status of an animal matters--psychological, biological, welfare, etc. status of the animal, as opposed to its value as property.  He suggested that animal law attorneys "may be the only lawyers on earth whose clients are all innocent."

He divided his talk up into multiple "species" of animal law.

Species 1: Companion Animal Issues

He said this makes up the majority of his cases, and includes injuries by or to animals, as well as veterinary malpractice.  The challenge is to get courts to recognize that animals are not merely property, since historically companion animals have been viewed as property with low or even zero market value.  In cases where an animal is injured or killed, the market value doesn't recognize the interests of the animal or other kinds of value that companion animals give.  Under the American Law Institute's Restatements of the Law, however, there is a notion of "special property" (or "peculiar property" in California's statutes) which allows quantification of other kinds of worth to an animal owner, for instance if the animal is a therapy dog.  Emotional distress damages, however, are not available.

Other sorts of companion animal cases include custody disputes, which often occur as a result of one partner just trying to inflict distress on another rather than having actual interest in the animal.  Wagman said that courts are beginning to take a better look at the interests of the animal in such cases, and be willing to appoint a guardian ad litem, as occurred in the Michael Vick case and in another case in Tennessee where there was a dispute over custody of a dog between a dead man's girlfriend and parents.

There are dangerous dog issues, where an attorney may be fighting against the classification of a dog as a dangerous or vicious animal, or against its euthanasia--what he called "capital cases" for animals.  In three counties surrounding San Francisco, what happens when a dog bites another dog badly enough to require stitches varies dramatically.  In one county, the dog gets a period of probation.  In another, the dog gets labeled as a dangerous or vicious dog, which requires the owner to meet various conditions: housing the dog securely, having a certain height of fence, carrying additional insurance, and so forth.  And in Santa Clara County, the dog gets euthanized.  He pointed out that that county's statute has an exemption for "mitigating circumstances" which he's successfully used to prevent dogs from being euthanized.

Finally, there are wills and trusts--he said he doesn't do that sort of work, but that 48 states now have mechanisms for having trusts for animals.

He said he considers companion animals to be a sort of "gateway animal" for getting recognition of animals in the law, and noted that we tend to be "speciesists" who would feel very different about snakes vs. Labrador Retrievers.  [IMO, this is rational to the extent that animals differ in cognitive capacities, and I note that at no point did he discuss litigating on behalf of cockroaches against pest control companies.]

Species 2: Farm animal issues--legislation and litigation.

His second species of animal law concerns animals killed for food--about 10 billion per year in the United States.  He said the goal here is not to stop the killing, but just to improve the living conditions of animals before they're killed for food.  This is problematic, however, because the animal cruelty statutes are criminal rather than civil (with an exception in North Carolina that is discussed under Species 3 below), and because the criminal law for animal cruelty excludes farm animals in 35 states.  He discussed a few of the more abusive methods of animal treatment in factory farming--calf crates, in which calves are placed for about the first 60 days of life; gestation crates for pigs (outlawed in Arizona since 2006, as well as illegal in Florida, Oregon, Colorado, and California); and battery cages for chickens.

He also discussed downer animals--animals which are either so seriously injured or ill that they are unable to move, which the meat industry wants to drag in that condition to slaughter.  Wagman raised the concern that such animals, if sick, could potentially spread illness to humans, and listed a bunch of diseases that could potentially so spread, with BSE (mad cow) at the top of the list along with avian flu.  Of these, only BSE has been documented to spread to humans, and the industry position is that there should be no restrictions on downer pigs unless and until a human actually gets sick.  The state of California passed a law that said that all downer animals must be euthanized on the spot; the meat industry sued and overturned the statute in federal district court, but the 9th Circuit just reversed it last week (National Meat Association v. Brown).

Species 3: Animal hoarding--private ownership, breeders, and the sanctuary that is not

Wagman said that there have been 250,000 documented cases of animal hoarding, and that they are difficult cases to work with in multiple ways.  He said he believes such cases involve mental illness, but while the APA has a diagnosis for "hoarding" behavior, it excludes animal hoarding, which is considered to be different.  How many animals constitute hoarding?  He said he likes to say "more than eight," because he has eight animals at home.  Characteristics of hoarders include possessing more animals than they can care for, having a sense of being persecuted, and living in deplorable conditions.

He discussed two cases that he litigated, ALDF v. Barbara & Robert Woodley, and ALDF v. Janie Conyers, which involved over 500 animals between them.  The former case, in North Carolina, was able to use North Carolina statute 19a, which allows a civil cause of action for animal cruelty.  Wagman had some horrifying photos from the Woodley case.  They had hundreds of dogs in their home living in their own feces, where ammonia levels were 20 times the USDA maximum allowed in a pig facility.  These ammonia levels caused blindness in the dogs, as well as chemical burns to bare skin that contacted the floor, such as dogs' scrotums.  Multiple dogs were kept in wooden boxes with lids on them, and never let out.  Mrs. Woodley's favorite dog, Buddy, not only had his eyes burned to blindness from ammonia, but the bone in the dog's jaw deteriorated from malnutrition.  Local officials had known of Woodley's problem for 20 years, but considered themselves powerless to do anything about it, since the scale of the problem was so large--the local shelter had only eight kennels, while the Woodleys had about 450 animals.  The ALDF had to coordinate a massive effort to manage the rescue of the animals through their case.

Conyers was an AKC poodle breeder who had 106 poodles living in their own feces.

Wagman said that animal psychological suffering is difficult to show, but it can be done; demonstrating physiological suffering is easier, with objective criteria like the ammonia levels and physical injuries to animals.

There is no law against hoarding (except in Hawaii), just the criminal abuse statutes (and civil in NC).  In the hoarding cases the abuse is typically neglect rather than active abuse.

Species 4: Exotic animal ownership

Wagman has handled about 10 chimpanzee cases.  One was a case involving a couple in West Covina, California who had a chimp named Moe for 35 years that bit two people.  He argued for a guardian ad litem to determine what was in the best interests of the chimp, and arranged to get Jane Goodall and Roger Fouts for that role.  The court looked upon it favorably, but the couple came to an out-of-court settlement.

He also briefly discussed the Stamford, Connecticut case of Travis, the 200-pound chimpanzee who attacked a woman that was in the news last year.

He argued that there should be a legislative fix to ban exotic animal ownership completely--they're wild animals.  [A complete ban seems to me too much--there should be exceptions for research, conservation, breeding programs for endangered species, and so forth.  And shouldn't it be possible to domesticate other wild animals?]  Connecticut has taken the step of banning chimp ownership.

Species 5: Shelter practices - euthanasia, veterinary care, adequate food, water, and sanitation, and hold periods

Animal shelters have an overwhelming job, said Wagman.  The County of Los Angeles, which he sued, operates seven shelters which handle tens of thousands of animals per year.  California law says that all animals must get veterinary care and be held for five days, and allowing animal suffering without treatment is not permissible.  The shelters' own records showed that they weren't meeting that standard for thousands of animals, but they're now working to meet it and having their activity monitored for compliance.  A similar set of cases occurred in Kentucky, when the state transferred all shelter responsibility to the counties.  Although the standards of care were minimal, the counties weren't meeting them, and there were nutrition, veterinary care, and euthanasia issues.  Upon getting notice, they quickly took action to remedy the problems.

In Georgia, by contrast, there is a statute that prohibits the use of gas chambers for euthanization at shelters, but the Commissioner of Agriculture sent out letters to the shelters asking that they purchase gas chambers for euthanization.  Gas chambers apparently have very ugly results in some cases, such as with unhealthy dogs.  A lawsuit against the state of Georgia for its failure to comply with its own statute resulted in an injunction, which the state then immediately violated by sending out more letters asking for gas chamber purchases.  After obtaining a contempt ruling from the court, the plaintiffs finally got compliance.

Species 6: Entertainment

Wagman called this category both the most obvious and the most hidden.  The use of animals in entertainment is obvious, but what is not obvious is what goes on behind the scenes, the knowledge of which drains the fun out of the entertainment.

Circuses, zoos, film and TV ads, animal fighting, public appearances, racing and rodeos, and hunting and fishing are all cases of animals used for entertainment.  Wagman first discussed elephants in circuses, commenting on a recent Ringling Brothers case which was tossed out on an issue of standing.  The case involved the use of bullhooks for elephant training, which injures the animals.  The defense didn't deny use of bullhooks, but claimed that they only use them as "guides."

Elephant treatment in zoos is also problematic, since standing around on hard surfaces causes painful arthritis.  In the wild, elephants are awake 21 hours a day and may move 35 miles per day.

Wagman discussed dog fighting, and said that the Michael Vick case was a wakeup call for America to the reality of dog fighting, which exists in every state and most major cities.

He argued that the use of great apes in film and television should be banned, because of how the training process works.  He said that while trainers claim to use only positive reinforcement training, an undercover person who volunteered for a year and a half with trainer Sid Yost found otherwise.  Young chimpanzees are immediately subjected to beating and punching to get them to comply.  Their performance lifetime is about 3-5 years, after which they become too strong to control, and they end up in private homes, in research, or in zoos, often all alone in barren cages.  Wagman pointed out that the "smiling" chimpanzee expression commonly seen in media is actually a fear grimace.  He does lots of work for sanctuaries, of which there are nine in the U.S. for chimpanzees (including chimpsanctuarynw.org).

Regarding hunting, he distinguished traditional hunting from canned hunting and Internet hunting.  Hunting is protected in most states, including in many state constitutions.  Canned hunting ranches, where animals are fed by hand by humans before they are flushed out into open areas to be shot, are not considered to be hunting by most traditional hunters.  [But is considered hunting by our former Vice President, Dick Cheney.]  Internet hunting, where a rifle can be fired at live animals over the Internet, has been banned in 30 states.

He mentioned mountain lion hunting in the Black Hills of South Dakota, where mountain lions have become fairly scarce.  A lawsuit was filed to try to stop the hunting on grounds of near-extinction of the animals, but the injunction was denied on the grounds that there were unlikely to be any mountain lions even found and killed.  Two mountain lions were killed shortly thereafter in fairly quick succession, and even though there was a law that prohibited killing female mountain lions with cubs, the second one killed had a cub, and there was no prosecution.

Some Adidas shoes are made with kangaroo skin, and the state of California has banned the importation of kangaroo skin, which Adidas ignored.  Adidas was sued as a result, and they lost at the California Supreme Court--but they responded by persuading the legislature to repeal the ban rather than changing their practices.

Species 7: Species and breed-specific legislation and ADA breedism case.

A variety of dog breeds have been considered at various times and places to be "bad dogs" that create a special danger.  After WWII, it was German Shepherds and Dobermans.  All cases to stop such breed-specific legislation have failed, because the "rational relation" standard is met by only a single case of harm.  A case in progress right now in Concord, California involves Theresa Huerta, a woman suing under the Americans with Disabilities Act to keep her pit bull therapy dog from being euthanized.

Wagman concluded by saying that his overall objective is to keep the public and the courts focused on the real issue, which is ending blatant cases of animal abuse.  Animal law is a growing field, and there's an annual animal law conference in Portland that's now in its fifth year.

Tuesday, April 06, 2010

First two stray dogs of 2010

I caught these two male dogs in the front yard this afternoon--they wandered in while the gate was open, and I closed it to catch them.  No collars, no tags, and the pit mix was unneutered (didn't check the Spitz mix or whatever he is).  At first they were very skittish, but after they finally approached me, both wanted my constant attention.  They were both quickly picked up by the Maricopa County pound--I'm sure they'll get taken to the east side.

As I was closing the gate to catch these guys, I heard a car honk its horn and a dog yelp, and looked up to see the car drive away as a man, woman, and dog stood on the sidewalk, the dog limping.  I asked the man if the dog had just been hit, and if it was his dog, and he answered yes to both.  They walked off, the dog limping (and off leash, with no collar or tags).

Please, if you own animals, be a responsible pet owner.

Against "coloring book" history of science

It's a bad misconception about evolution that it proceeds in a linear progression, one successively evolving species after another displacing its immediate ancestors.  Such a conception of human history is equally mistaken, and is often criticized with terms such as "Whiggish history" or "determinism" with a variety of adjectives (technological, social, cultural, historical).  That includes the history of science, where the first version we often hear is one that has been rationally reconstructed by looking back at the successes and putting them into a linear narrative.  Oh, there are usually a few errors thrown in, but they're usually fit into the linear narrative as challenges that are overcome by the improvement of theories.

The reality is a lot messier, and getting into the details makes it clear that not only is a Whiggish history of science mistaken, but that science doesn't proceed through the algorithmic application of "the scientific method," and in fact that there is no such thing as "the scientific method."  Rather, there is a diverse set of methods that are themselves evolving in various ways, and sometimes not only do methods which are fully endorsed as rational and scientific produce erroneous results, sometimes methods which have no such endorsement and are even demonstrably irrational fortuitously produce correct results.  For example, Johannes Kepler was a neo-pythagorean number mystic who correctly produced his second law of planetary motion by taking an incorrect version of the law based on his intuitions and deriving the correct version from it by way of a mathematical argument that contained an error.  Although he fortuitously got the right answer and receives credit for devising it, he was not justified in believing it to be true on the basis of his erroneous proof.  With his first law, by contrast, he followed an almost perfectly textbook version of the hypothetico-deductive model of scientific method of formulating hypotheses and testing them against Tycho Brahe's data.

The history of the scientific revolution includes numerous instances of new developments occurring piecemeal, with many prior erroneous notions being retained.  Copernicus retained not only perfectly circular orbits and celestial spheres, but still needed to add epicycles to get his theory anywhere close to the predictive accuracy of the Ptolemaic models in use.  Galileo insisted on retaining perfect circles, maintaining that circular motion was natural motion, and refused to consider Kepler's elliptical orbits.  There seems to be a good case for "path dependence" in science.  Even the most revolutionary changes are actually building on bits and pieces that have come before--and sometimes rediscovering work that had already been done before, like Galileo's derivation of the uniform acceleration of falling bodies, which had already been done by Nicole Oresme and the Oxford calculators.  And the social and cultural environment--not just the scientific history--has an effect on what kinds of hypotheses are considered and accepted.

This conservativity of scientific change is a double-edged sword.  On the one hand, it suggests that we're not likely to see claims that purport to radically overthrow existing theory (that "everything we know is wrong") succeed--even if they happen to be correct.  And given that there are many more ways to go wrong than to go right, such radical revisions are very likely not to be correct.  Even where new theories are correct in some of their more radical claims (e.g., like Copernicus' heliocentric model, or Wegener's continental drift), it often requires other pieces to fall into place before they become accepted (and before it becomes rational to accept them).  On the other hand, this also means that we're likely to be blinded to new possibilities by what we already accept that seems to work well enough, even though it may be an inaccurate description of the world that is merely predictively successful.  "Consensus science" at any given time probably includes lots of claims that aren't true.

My inference from this is that we need both visionaries and skeptics, and a division of cognitive labor that's largely conservative, but with tolerance for diversity and a few radicals generating the crazy hypotheses that may turn out to be true.  The critique of evidence-based medicine made by Kimball Atwood and Steven Novella--that it fails to consider prior plausibility of hypotheses to be tested--is a good one that recognizes the unlikelihood of radical hypotheses to be correct, and thus that huge amounts of money shouldn't be spent to generate and test them.  (Their point is actually stronger than that, since most of the "radical hypotheses" in question are not really radical or novel, but are based on already discredited views of how the world works.)  But that critique shouldn't be taken to exclude anyone from engaging in the generation and test of hypotheses that don't appear to have a plausible mechanism, because there is ample precedent for new phenomena being discovered before the mechanisms that explain them.

I think there's a tendency among skeptics to talk about science as though it's a unified discipline, with a singular methodology, that makes continuous progress, and where the consensus at any moment is the most appropriate thing to believe.  The history of science suggests, on the other hand, that it's composed of multiple disciplines, with multiple methods, that proceeds in fits and starts, that has dead-ends, that sometimes rediscovers correct-but-ignored past discoveries, and is both fallible and influenced by cultural context.  At any given time, some theories are not only well-established but unified well with others across disciplines, while others don't fit comfortably well with others, or may be idealized models that have predictive efficacy but seem unlikely to be accurate descriptions of reality in their details.  To insist on an overly rationalistic and ahistorical model is not just out-of-date history and philosophy of science, it's a "coloring book" oversimplification.  While that may be useful for introducing ideas about science to children, it's not something we should continue to hold to as adults.

Friday, April 02, 2010

Scientific autonomy, objectivity, and the value-free ideal

It has been argued by many that science, politics, and religion are distinct subjects that should be kept separate, in at least one direction if not both.  Stephen Jay Gould argued that science and religion have non-overlapping areas of authority (NOMA, or non-overlapping magisteria), with the former concerned with how questions and the latter with why questions, and that conflicts between them won’t occur if they stick to their own domain.  Between science and politics, most have little problem with science informing politics, but a big problem with political manipulation of science.  Failure to properly maintain the boundaries leads to junk science, politicized science, scientism, science wars, and other objectionable consequences.

Heather E. Douglas, in Science, Policy, and the Value-Free Ideal argues that notions of scientific autonomy and a scientific ideal of being isolated from questions of value (political or otherwise) are mistaken, and that this idea of science without regard to value questions (apart from epistemic virtues) is itself a contributing factor to such consequences.  She attributes blame for this value-free ideal of science to post-1940 philosophy of science, though the idea of scientific autonomy appears to me to have roots much further back, including in Galileo’s “Letter to Castelli” and "Letter to the Grand Duchess Christina" and John Tyndall’s 1874 Belfast Address, which were more concerned to argue that religion should not intrude into the domain of science rather than the reverse.  (As I noted in a previous post about Galileo, he did not carve out complete autonomy for natural philosophy from theology, only for those things which can be demonstrated or proven, which he argued that scripture could not contradict--and where it apparently does, scripture must be interpreted allegorically.)

Douglas describes a “topography of values” in the categories of cognitive, ethical, and social values, and distinguishes direct and indirect roles for them.  Within the “cognitive” category go values pertaining to our ability to understand evidence, such as simplicity, parsimony, fruitfulness, coherence, generality, and explanatory power, but excluding truth-linked epistemic virtues such as internal consistency and predictive competency or adequacy, which she identifies not as values but as minimal negative conditions that theories must necessarily meet.  Ethical values and social values are overlapping categories, the former concerned with what’s good or right and the latter with what a particular society values, such as “justice, privacy, freedom, social stability, or innovation” (Douglas, p. 92).  Her distinction between a direct and indirect role is that the former means that values can act directly as reasons for decisions, versus indirectly as a factor in decision-making where evidence is uncertain.

Douglas argues that values can legitimately play a direct role in certain phases of science, such as problem selection, selection of methodology, and in the policy-making arena, but should be restricted to an indirect role in phases such as data collection and analysis and drawing conclusions from evidence.  She identifies some exceptions, however--problem selection and method selection can’t legitimately be guided by values in a way that undermines the science by forcing a pre-determined conclusion (e.g., by selecting a method that is guaranteed to be misleading), and a direct role for ethical values can surface in later stages by discovering that research is causing harm.

Her picture of science is one where values cannot directly intrude between the collection of data and the inference of the facts from that data, but the space between evidence and fact claims is somewhat more complex than she describes.  There is the inference by a scientist of a fact from the evidence, the communication of that fact to other scientists, the publication of that fact in the scientific literature, and its communication to the general public and policy makers.  All but the first of these are not purely epistemic, but are also forms of conduct.  It seems to me that there is in fact a potential direct role for ethical values in each such type of conduct, which in particular circumstances could merit withholding the fact claim.  For example, a scientist in Nazi Germany could behave ethically by withholding information about how to build an atomic bomb.

Douglas argues that the motivation for the value-free ideal is as a mechanism for preserving scientific objectivity; she therefore gives an account of objectivity that comports with her account of science with values.  She identifies seven types of objectivity that are relevant in three different domains (plus one she rejects), all of which have to do with a shared ground for trust.  First, within the domain of human interactions with the world, are “manipulable objectivity,” or the ability to repeatably and reliably make interventions in nature that give the same result, and “convergent objectivity,” or having supporting evidence for a conclusion from multiple independent lines of evidence.  Second, in the realm of individual thought processes, she identifies “detached objectivity”--a scientific disinterest, freedom from bias, and eschewing the use of values in place of evidence.  There’s also “value-free objectivity,” the notion behind the value-free ideal, which she rejects.  And there’s “value-neutral objectivity,” or leaving personal views aside in, e.g., conducting a review of the literature in a field and identifying possible sets of explanations, or taking a "centrist" or "balanced" view of potentially relevant values.  Finally, in the domain of social processes, Douglas identifies “procedural objectivity,” where use of the same procedures produces the same results regardless of who engages in the procedure, and “intersubjectivity” in two senses--“concordant objectivity,” agreement in judgments between different people, and “interactive objectivity,” agreement as the result of argument and deliberation.

Douglas writes clearly and concisely, and makes a strong case for the significance of values within science as well as in its application to public policy.  Though she limits her discussion to natural science (and focuses on scientific discovery rather than fields of science that involve the production of new materials, an area where more direct use of values is likely appropriate), her account could likely be extended with the introduction of a bit more complexity.  While I don’t think she has identified all or even the primary causes of the “science wars,” which she discusses at the beginning of her book, I think her account is more useful in adjudicating the “sound science”/“junk science” debate that she also discusses, as well as identifying a number of ways in which science isn’t and shouldn’t be autonomous from other areas of society.

[A slightly different version of the above was written as a comment for my Human and Social Dimensions of Science and Technology core seminar. Thanks to Judd A. for his comments.]

Thursday, April 01, 2010

Galileo on the relation between science and religion

Galileo’s view of natural philosophy (science) is that it is the study of “the book of nature,” “written in mathematical language” (Finocchiaro 2008, p. 183), as contrasted with theology, the study of the book of Holy Scripture and revelation.  Galileo endorses the idea that theology is the “queen” of the “subordinate sciences” (Finocchiaro 2008, p. 124), by which he does not mean that theology trumps science in any and all matters.  He distinguishes two senses of theology being “preeminent and worthy of the title of queen”: (1) That “whatever is taught in all the other sciences is found explained and demonstrated in it [theology] by means of more excellent methods and of more sublime principles,” [Note added 12/14/2012: which he rejects] and (2) That theology deals with the most important issues, “the loftiest divine contemplations” about “the gaining of eternal bliss,” but “does not come down to the lower and humbler speculations of the inferior sciences ... it does not bother with them inasmuch as they are irrelevant to salvation” [Note added 12/14/2012: which he affirms] (quotations from Finocchiaro 2008, pp. 124-125).  Where Holy Scripture makes reference to facts about nature, they may be open to allegorical interpretation rather than literal interpretation, unless their literal truth is somehow necessary to the account of “the gaining of eternal bliss.”

Galileo further distinguishes two types of claims about science:  (1) “propositions about nature which are truly demonstrated” and (2) “others which are simply taught” (Finocchiaro 2008, p. 126).  The role of the theologian with regard to the former category is “to show that they are not contrary to Holy Scripture,” e.g., by providing an interpretation of Holy Scripture compatible with the proposition; with regard to the latter, if a proposition contradicts Holy Scripture, it must be considered false and demonstrations of its falsity sought (Finocchiaro 2008, p. 126).  Presumably, if in the course of attempting to demonstrate that a proposition in the second category is false it is instead demonstrated to be true, it must then be moved into the former category.  Galileo’s discussion allows that theological condemnation of a physical proposition may be acceptable if the proposition is shown not to be conclusively demonstrated (Finocchiaro 2008, p. 126), rather than requiring the more stringent standard that it be conclusively demonstrated to be false--which, given his own lack of conclusive evidence for heliocentrism, could be considered a loophole by which he was hoist with his own petard.

Galileo also distinguishes between what is apparent to experts vs. the layman (Finocchiaro 2008, p. 131), denying that popular consensus is a measure of truth, and regards this distinction as what lies behind claims made in Holy Scripture about physical propositions that are not literally true.  With regard to the theological expertise of the Church Fathers, their consensus on a physical proposition is not sufficient to make it an article of faith unless such consensus is upon “conclusions which the Fathers discussed and inspected with great diligence and debated on both sides of the issue and for which they then all agreed to reject one side and hold the other” (Finocchiaro 2008, p. 133).  Or, in a contemporary (for Galileo) context, the theologians of the day could have a comparably weighted position on claims about nature if they “first hear the experiments, observations, reasons, and demonstrations of philosophers and astronomers on both sides of the question, and then they would be able to determine with certainty whatever divine inspiration will communicate to them” (Finocchiaro 2008, p. 135).

The conception of science that leads Galileo to take this position appears to be drawn from what Peter Dear (1990, p. 664), drawing upon Thomas Kuhn (1977), calls “the quantitative, ‘classical’ mathematical sciences” or the “mixed mathematical sciences,” which Dear identifies as a predominantly Catholic conception of science, as contrasted with the experimental science developed in Protestant England.  The former conception is one in which laws of nature are recognized through idealized thought experiments based on limited (or no) actual observation and demonstrated conclusively by means of rational argument.  This seems to be the general mode of Galileo’s work.  Dear argues that this notion of natural law allows for a conception of the “ordinary course of nature” which can be violated by an observed miraculous event, which comports with a Catholic view that miracles continue to occur in the world.

By contrast, the experimentalist views of Francis Bacon and Robert Boyle involve inductively inferring natural laws on the basis of observations, in which case observing something to occur makes it part of nature that must be accounted for in the generalized law--a view under which a miracle seems to be ruled out at the outset, which was not a problem for Protestants who considered the “age of miracles” to be over (Dear 1990, pp. 682-683).  Dear argues that for the British experimentalists, authentication of an experimental result was in some ways like the authentication of a miracle for the Catholics--requiring appropriately trustworthy observations--but that instead of verifying a violation of the “ordinary course of nature,” it verified what the “ordinary course of nature” itself was (Dear 1990, p. 680).  Where Catholics like Galileo and Pascal derived conclusions about particulars from universal laws recognized by observation, reasoning, and mathematical demonstration, Protestants like Bacon and Boyle constructed universal laws by inductive generalization from observations of particulars.  They were notably critical of drawing conclusions before performing a sufficient number of experiments (McMullin 1990, p. 821), and put forth standards for hypotheses and experimental method (McMullin 1990, p. 823; Shapin & Schaffer 1985, pp. 25ff & pp. 56-59).  The English experimentalist tradition, arising at a time of political and religious confusion after the English Civil War and the collapse of the English state church, was perhaps an attempt to establish an independent authority for science.  By the 19th century, there were explicit (and successful) attempts to separate science from religious authority and create a professionalized class of scientists (e.g., as Gieryn 1983, pp. 784-787 writes about John Tyndall).

The English experimentalists followed the medieval scholastics (Pasnau, forthcoming) in adopting a notion of “moral certainty”--“the highest degree of probabilistic assurance”--for conclusions drawn from experiments (Shapin 1994, pp. 208-209).  This falls short of the Aristotelian conception of knowledge, yet is stronger than mere opinion.  They also placed importance on public demonstration in front of appropriately knowledgeable witnesses, with the credibility of both experimenter and witnesses being relevant to the credibility of the result.  Where on Galileo’s conception expertise appears to be primarily a function of possessing rational faculties and knowledge, the experimentalist account places importance on skill in the application of method and on the moral trustworthiness of the participants who vouch for the observational results.  In the Galilean approach, trustworthiness appears to be less relevant as a consequence of actual observation being less relevant--though Galileo does, from time to time, make remarks about observations refuting Aristotle, e.g., in “Two New Sciences” where he criticizes Aristotle’s claims about falling bodies (Finocchiaro 2008, pp. 301, 303).

The classic Aristotelian picture of science is similar to the Galilean approach, in that observation and data collection are done for the purpose of recognizing first principles and deriving demonstrations by reason from those first principles.  What constitutes knowledge is what can be known conclusively from such first principles and what is derived by necessary connection from them; whatever doesn’t meet that standard is mere opinion (Posterior Analytics, Book I, Ch. 33; McKeon 1941, p. 156).  The Aristotelian picture doesn’t include any particular deference to theology; any discipline could potentially yield knowledge so long as there were recognizable first principles.  The role of observation isn’t to come up with fallible inductive generalizations, but to recognize identifiable universal and necessary features from their particular instantiations (Lennox 2006).  This discussion is all about theoretical knowledge (episteme) rather than practical knowledge (techne), the latter of which concerns contingent facts about everyday things that can change.  Richard Parry (2007) points out an apparent tension in Aristotle between knowledge of mathematics and knowledge of the natural world, on account of his statement that “the minute accuracy of mathematics is not to be demanded in all cases, but only in the case of things which have no matter.  Hence its method is not that of natural science; for presumably the whole of nature has matter” (Metaphysics, Book II, Ch. 3; McKeon 1941, p. 715).

The Galilean picture differs from the Aristotelian in its greater use of mathematics (geometry)--McMullin writes that Galileo had “a mathematicism ... more radical than Plato’s” (1990, pp. 822-823)--and in its inclusion of the second book, that of revelation and Holy Scripture, as a source of knowledge.  But while the second book can trump mere opinion--anything that isn’t conclusively demonstrated and thus fails to meet Aristotle’s standard for knowledge--it must be held compatible with anything that does meet that standard.

References
  • Peter Dear (1990) “Miracles, Experiments, and the Ordinary Course of Nature,” Isis 81:663-683.
  • Maurice A. Finocchiaro, editor/translator (2008) The Essential Galileo.  Indianapolis: Hackett Publishing Company.
  • Thomas Gieryn (1983) “Boundary Work and the Demarcation of Science from Non-Science: Strains and Interests in Professional Ideologies of Scientists,” American Sociological Review 48(6, December):781-795.
  • Thomas Kuhn (1957) The Copernican Revolution: Planetary Astronomy in the Development of Western Thought.  Cambridge, Mass.: Harvard University Press.
  • Thomas Kuhn (1977) The Essential Tension.  Chicago: The University of Chicago Press.
  • James Lennox (2006) “Aristotle’s Biology,” Stanford Encyclopedia of Philosophy, online at http://plato.stanford.edu/entries/aristotle-biology/, accessed March 18, 2010.
  • Richard McKeon (1941) The Basic Works of Aristotle. New York: Random House.
  • Ernan McMullin (1990) “The Development of Philosophy of Science 1600-1900,” in Olby et al. (1990), pp. 816-837.
  • R.C. Olby, G.N. Cantor, J.R.R. Christie, and M.J.S. Hodge (1990) Companion to the History of Science.  London: Routledge.
  • Richard Parry (2007) “Episteme and Techne,” Stanford Encyclopedia of Philosophy, online at http://plato.stanford.edu/entries/episteme-techne/, accessed March 18, 2010.
  • Robert Pasnau (forthcoming) “Medieval Social Epistemology: Scientia for Mere Mortals,” Episteme, forthcoming special issue on history of social epistemology.  Online at http://philpapers.org/rec/PASMSE, accessed March 18, 2010.
  • Steven Shapin and Simon Schaffer (1985) Leviathan and the Air Pump: Hobbes, Boyle, and the Experimental Life.  Princeton, N.J.: Princeton University Press.
  • Steven Shapin (1994) A Social History of Truth: Civility and Science in Seventeenth-Century England. Chicago: The University of Chicago Press.
[The above is slightly modified from one of my answers on a midterm exam.  My professor observed that another consideration on the difference between Catholic and Protestant natural philosophers is that theological voluntarism, more prevalent among Protestants, can suggest that laws of nature are opaque to human beings except through inductive experience.  NOTE ADDED 13 April 2010: After reading a couple of chapters of Margaret Osler's Divine Will and the Mechanical Philosophy: Gassendi and Descartes on Contingency and Necessity in the Created World (2005, Cambridge University Press), I'd add Pierre Gassendi to the experimentalist/inductivist side of the ledger, despite his being a Catholic--he was a theological voluntarist.]