Comments on The Lippard Blog: "Is the general public really that ignorant? Public understanding of science vs. civic epistemology"

Lippard (2010-04-21 08:07):

I think you need a certain amount to be built in (evolved, really) in order for reasoning to get off the ground in the first place, and that many of the cognitive tools we use are learned (look, for example, at the development of mathematics, logic, science, and philosophy). Pollock also focuses on the individual cognizer rather than on groups and institutions, which also play a part in how knowledge gets produced and validated.

L (2010-04-20 20:44):

Jim, you're right. I see that now; I was skimming through it quickly this morning.

He says a lot of interesting things in that paper. I do wonder, though, about this passage:

<i>"To do that, we must be able to inspect candidate expansion of argument sketches and evaluate them as good or bad arguments. But that just amounts to judging whether, if we reasoned in that way, we would be conforming to the dictates of rationality. Thus an essential feature of rational cognition must be the <b>built-in ability to judge whether particular bits of cognitive behavior conform to the dictates of rationality.</b>"</i>

How much of that ability is built into humans, and how much is learned? I suspect that whatever built-in abilities we have along that line are fairly minimal and that the rest come from learning.
If that's the case, though, then we might be treading on the dangerous ground of saying that it's possible for some cultures to be more or less rational than others.

Lippard (2010-04-20 07:49):

Neural Gourmet: "I think he's saying that any time we don't engage in explicit reasoning, i.e. any time we rely on a Q&I module, we are thinking irrationally."

No, he doesn't quite say that. Rather, he says that when we have reason to believe that explicit reasoning gives a different result than a Q&I module, we need to go with the explicit reasoning on pain of being irrational--see the penultimate sentence of section 3. This seems to me compatible with your position (and it seems right to me).

L (2010-04-20 07:08):

Thanks for the link to Pollock's paper (which is incredibly readable for that sort of thing). I've only skimmed it so far, but he has some interesting ideas, principally that irrational behavior results solely from an inability to override Q&I ("Quick and Inflexible") modules. He seems to be talking about innate heuristics, such as our ability to very quickly predict the trajectories of flying objects: we don't have to stop and reason through where a tossed ball will be in order to catch it. We just "know" and act accordingly.

His Q&I modules appear to me to be just another name for heuristics, although perhaps he means something broader. I'll have to read the full paper, but he's certainly right, I think, in saying that heuristics are an epistemic condition.
Indeed, I think modern neuroscience research has shown that it's very hard for people to disregard a highly successful heuristic and reason through a problem instead; the Wason Selection Task is a demonstration of that, I believe.

So by defining irrationality as the inability to override heuristics or Q&I modules, Pollock seems to take a stricter approach to rationality than I do. I think he's saying that any time we don't engage in explicit reasoning, i.e. any time we rely on a Q&I module, we are thinking irrationally.

My personal view is that heuristics are part and parcel of rational thinking, and that irrational thinking occurs when a person indiscriminately relies on a heuristic even after it has been demonstrated not to apply in a given situation. In fact, in the majority of cases, relying on heuristics is the rational thing to do, because heuristics allow us to come to decisions quickly with little energy expenditure. If we see a TV ad making an offer that seems "too good to be true," we know from experience that it probably is, so the safe bet is to simply assume it's bogus or a scam.

Where we get into trouble is when we attempt to apply a heuristic to a situation that's novel and outside our everyday experience. For instance, 9/11 conspiracy theorists often rely on what they think a controlled demolition of a building looks like to conclude that WTC building 7 was brought down by a controlled demolition. Their heuristic of what a controlled demolition looks like isn't at fault; the collapse of WTC 7 <i>does</i> look a lot like a controlled demolition.
The problem is that they're relying on that heuristic instead of considering that there might be multiple causes of a building's collapse that produce a collapse looking a lot like a controlled demolition.

Now, I don't think applying the "controlled demolition" heuristic to the collapse of WTC 7 is the irrational act. To me, the irrationality comes in when a 9/11 conspiracy theorist has had it demonstrated multiple times that their idea of what a controlled demolition looks like doesn't apply in this case and is counterfactual (i.e., we know WTC 7 collapsed due to structural damage from falling debris and fire), and relies on it anyway.

Lippard (2010-04-19 22:38):

Neural Gourmet: When I was waist-deep in epistemology, most of the philosophers I read tended to avoid the term "rationality" in favor of "knowledge," "justification," "epistemic norms," and similar talk, just because there were too many conflicting notions of rationality. Cherniak is an exception.

One of my professors at UA, the late John Pollock, did talk about rationality as rules of human reasoning, which he attempted to implement in an artificial intelligence system called OSCAR. Your question led me to find this paper of his online: <a href="http://oscarhome.soc-sci.arizona.edu/ftp/PAPERS/Epistemology,%20rationality,%20and%20cognition.pdf" rel="nofollow">"Epistemology, Rationality, and Cognition."</a>

There are definitely multiple concepts out there, including rules or norms about what it is reasonable to believe and rules or norms about what it is reasonable to do.
And there is disagreement about how those rules all fit together into a coherent whole.

Pollock, Harman, and Cherniak are all writing about defeasible, non-monotonic reasoning, in which your set of beliefs can contract as well as expand as new beliefs defeat or undermine existing ones. Cherniak proposes a condition of "minimal consistency," under which a rational agent will act to eliminate some, but not necessarily all, inconsistencies; Harman adopts a principle of conservatism that says we tend to retain beliefs and clean up inconsistencies only as they become salient.

Economists, game theorists, and decision theorists have different notions of rationality--out of my field.

L (2010-04-19 19:27):

<i>"These heuristics and biases are important for us to know and guard against, especially in institutional and policy-related contexts, but they may not be all-things-considered irrational in practice."</i>

Jim, you're much more aware of the literature than I am.
Is it any easier to start with what clearly counts as irrational and then work back from there to an idea of what's rational?

Actually, I'm thinking that the kind of rationality we have in mind, the kind we inherited from the Enlightenment philosophers, is actually a whole bunch of separate things taken together.

Lippard (2010-04-19 16:43):

Satisficing is more likely than optimality in the evolutionary environment, and that may mean worse-than-satisfactory performance in more abstruse intellectual environments.

From a philosophical perspective, when you try to construct a universal, optimal rationality, obstacles arise not just from practical considerations (which are large enough) but from limitations we've discovered in mathematics and logic--which motivated philosophical treatments like Christopher Cherniak's _Minimal Rationality_ (MIT Press, 1986) and Gilbert Harman's _Change in View: Principles of Reasoning_ (MIT Press, 1986).

The data in Kahneman, Slovic, and Tversky's _Judgment under Uncertainty: Heuristics and Biases_ (Cambridge University Press, 1982), while they show failings against an optimal standard, aren't really as bad as they originally seemed, are they?
These heuristics and biases are important for us to know and guard against, especially in institutional and policy-related contexts, but they may not be all-things-considered irrational in practice.

An important critique of the public-understanding-of-science model, one I didn't make explicitly, is that it really just tests rote memorization rather than reasoning skill or scientific methodology.

L (2010-04-19 14:10):

@badrescher I've yet to read any of Gerd Gigerenzer's books, but I do think that many skeptics and atheists oftentimes work with an overly restrictive definition of rationality.

badrescher (2010-04-19 13:55):

@neuralgourmet, true, but I don't define it the way Gigerenzer does, and I think that definition is a bit of a cop-out.

What seems to be the most "settled" definition in the literature is twofold (holding beliefs consistent with the evidence is part of it), but it certainly involves optimal choices, not "whatever works" choices. By those definitions and mine, I find Ken Miller rational enough, and for the same reasons you do. Limiting the scope of what one is willing to question is the opposite of irrational, imo.

Whether or not it is hypocritical or contradictory is another story, but rational? Sure it is. At least until we start talking about WHY he limits it...
;)

L (2010-04-19 13:38):

@badrescher I think it depends on how you define "rational." I also don't find Maher to be very rational, because he doesn't seem to come to reasoned conclusions. He doesn't think; he just reacts. Further, he seems impervious to expert testimony. Actually, let me rephrase that: he seems unable to determine which experts to trust.

But what about that old-time atheist whipping boy, Ken Miller? Many atheists would say he isn't rational, despite his scientific credentials, because he doesn't follow the science where they think it should take him with respect to his religion. Yet I have no problem calling Miller rational. It's true that he operates within a bounded rationality, but then I think we all, every one of us, do.

So I think defining "rational" is a very hard thing to do, and perhaps we almost have to settle for the tautological "rational is what rational people do/think."

@Jim Fascinating post.

badrescher (2010-04-19 13:04):

In answer to the question, I guess a point of reference is needed.

The ignorance that people point to in surveys is misleading at best, and most fail to compare the findings to those of previous years; it's actually improving. However, what I'd really like to comment on is the Wason four-card task.

It is clear that people are positively terrible at this task, and at conditional reasoning in general. When <i>certain</i> real-world scenarios are used, however, people generally perform well. But "perform well" is not the same thing as "reason well."
One may come to the correct answer in many ways. In this case, people come to it by using pragmatic schemas, which in no way implies that they are rational.

Bill Maher is an atheist, and I find that view to be the most rational conclusion. However, he holds a great many views that are, imo, irrational. Who knows how he arrived at the same conclusion I arrived at?
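The Wason task discussed in this thread can be made concrete with a small sketch. (This is illustrative only and not from the thread; the classic vowel/even-number version of the task is assumed, and the helper `must_flip` is hypothetical.)

```python
# Wason selection task (classic abstract version, assumed for illustration):
# the rule to test is "if a card has a vowel on one side, it has an even
# number on the other." Four cards show E, K, 4, 7. Only cards that could
# falsify the rule need to be flipped: a visible vowel (its hidden number
# might be odd) and a visible odd number (its hidden letter might be a
# vowel). People typically pick E and 4; the correct picks are E and 7.

def must_flip(face: str) -> bool:
    """True if this card could falsify 'vowel implies even number'."""
    if face.isalpha():
        return face.lower() in "aeiou"  # vowel: hidden side might be odd
    return int(face) % 2 == 1           # odd: hidden side might be a vowel

cards = ["E", "K", "4", "7"]
print([c for c in cards if must_flip(c)])  # prints ['E', '7']
```

The "pragmatic schemas" point above is that people who fail this abstract version often succeed when the same logical structure is dressed in a familiar social rule (e.g., checking ages and drinks), without their underlying conditional reasoning improving.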