The 2009 Hogan & Hartson Jurimetrics Lecture in honor of Lee Loevinger was given on the afternoon of November 5 at Arizona State University's Sandra Day O'Connor College of Law by Robert B. Laughlin. Laughlin, the Ann T. and Robert M. Bass Professor of Physics at Stanford University and winner of the 1998 Nobel Prize in Physics (along with Horst L. Störmer and Daniel C. Tsui), spoke about his recent book, The Crime of Reason.
He began with a one-sentence summary of his talk: "A consequence of entering the information age is probably that we're going to lose a human right that we all thought we had but never did ..." The sentence went on, but I couldn't keep up in my notes to get it verbatim, and I'm not sure I could identify precisely what his thesis was even after hearing the entire talk and Q&A session. The main gist, though, was that he thinks a consequence of allowing manufacturing to go away and becoming a society based on information is that "Knowledge is dear, therefore there has to be less of it--we must prevent others from knowing what we know, or you can't make a living from it." And, he said, "People who learn on their own are terrorists and thieves," which I think was intentional hyperbole. I think his talk was loaded with overgeneralizations, some of which he retracted or qualified during the Q&A.
It certainly doesn't follow from knowledge being valuable that there must be less of it. Unlike currency, knowledge isn't a fungible commodity; different bits of knowledge have different value to different people. There are also different kinds of knowledge--know-how vs. knowledge-that--and making the latter freely available doesn't necessarily degrade the value of the former, which is why it's possible to have a business model that gives away software for free but makes money from consulting services. Further, the more knowledge there is, the more valuable it is to know where to find the particular bits that are useful for a given purpose, and the less possible it is for a single person to be an expert across many domains. An increasing amount of knowledge means increasing value in various kinds of specialization, and more opportunities for individuals to develop expertise in niches that aren't already full of experts.
Laughlin said that he is talking about "the human rights issue of the 21st century," that "learning some things on your own is stealing from people. What we think of as our rights are in conflict with the law, just as slavery is in conflict with human rights." He said that Jefferson was conflicted on this very issue, saying on the one hand that "knowledge is like fire--divinely designed to be copyable like a lit taper--I can light yours with mine, which in no way diminishes my own." This is the non-rival quality of information: one person's copying information from another doesn't deprive the other of its use, though it certainly may have an impact on the commercial market for the first person to sell that information.
"On the other hand," said Laughlin, "economics involves gambling. [Jefferson] favored legalized gambling. Making a living involves bluff and not sharing knowledge." He said that our intellectual property laws derive from English laws that people on the continent "thought ... were outrageous--charging people to know things."
He put up a photo of a fortune from a fortune cookie that said, "The only good is knowledge, and the only evil ignorance." He said this is what you might tell kids in school to get them to study, but there's something not right about it. He then put up a drawing of Dr. Frankenstein and his monster (Laughlin drew most of the slides himself). We're all familiar with the Frankenstein myth, he said. "The problem with open knowledge is that some of it is dangerous. In the U.S. some of it is off-limits; you can't use it in business or even talk about it. It's not what you do with it that's exclusive, but that you have it at all."
His example was atomic bomb secrets and the Atomic Energy Act of 1954, which makes it a federal felony to reveal "nuclear data" to the public--a term that has been defined very broadly in the courts, encompassing even numbers and principles of physics.
Laughlin returned to his fortune cookie example and said there's another problem. He put up a drawing of a poker game. "If I peeked at one guy's cards and told everyone else, the poker game would stop. It involves bluffing, and open access to knowledge stops the game." He suggested that this is what happened last year with the world financial sector--the "poker game in Wall Street stopped, everyone got afraid to bet, and the government handled it by giving out more chips and saying keep playing, which succeeded." While I agree that this was a case where knowledge--specifically knowledge of the growing amounts of "toxic waste" in major world banks--caused things to freeze up, the knowledge wasn't the ultimate cause; the cause was the fact that banks engaged in incredibly risky behavior that they shouldn't have. More knowledge earlier--and better oversight and regulation--could have prevented the problem.
Laughlin said "Economics is about bluff and secrecy, and open knowledge breaks it." I don't think I agree--what makes markets function is that price serves as a public signal about knowledge. There's always going to be local knowledge that isn't shared, not necessarily because of bluff and secrecy, but simply due to the limits of human capacities and the dynamics of social transactions. While trading on private knowledge can result in huge profits, trading the private knowledge itself can be classified as insider trading and is illegal. (Though perhaps it shouldn't be, since insider trading has the potential for making price signals more accurate more quickly to the public.)
Laughlin showed a painting of the death of Socrates (by Jacques-Louis David, not Laughlin this time), and said that in high school you study Plato, Aristotle, and Descartes, and learn that knowledge is good. But, "as you get older, you learn there's a class system in knowledge." Plato and company are classified as good, but working-class technical knowledge, like how to build a motor, is not, he claimed. He went on to say, "If you think about it, that's exactly backwards." I'm not sure anyone is ever taught that technical knowledge is not valuable, especially these days, when computer skills seem to be nearly ubiquitous--and I disagree with both extremes. From my personal experience, some of the abstract thinking skills I learned from studying philosophy have been among the most valuable skills I've used in both industry and academia, relevant to both theoretical and practical applications.
Laughlin said that "engines are complicated, and those who would teach you about it don't want to be clear about it. It's sequestered by those who own it, because it's valuable. The stuff we give away in schools isn't valuable, that's why we give it away." In the Q&A, a questioner observed that he can easily obtain all sorts of detailed information about how engines work, and that what makes it difficult to understand is the quantity and detail. Laughlin responded that sometimes the best way to hide things is to put them in plain sight (the Poe "purloined letter" point), as needles in a haystack. But I think that's a rather pat answer to something that is contradictory to his claim--the information really is freely available and easy to find, but the limiting factor is that it takes time to learn the relevant parts to have a full understanding. The limit isn't the availability of the knowledge or that some of it is somehow hidden. I'd also challenge his claim that the knowledge provided in schools is "given away." It's still being paid for, even if it's free to the student, and much of what's being paid for is the know-how of the educator, not just the knowledge-that of the specific facts, as well as special kinds of knowledge-that--the broader frameworks into which individual facts fit.
Laughlin went on to say, "You're going to have to pay to know the valuable information. Technical knowledge will disappear and become unavailable. The stuff you need to make a living is going away." He gave as examples defense-related technologies, computers, and genetics. He said that "people in the university sector are facing more and more intense moral criticism" for sharing information. "How life works--would we want that information to get out? We might want to burn those books. The 20th century was the age of physics, [some of which was] so dangerous we burned the books. It's not in the public domain. The 21st century is the age of biology. We're in the end game of the same thing. In genetics--e.g., how disease organisms work. The genetic structure of Ebola or polio." Here, Laughlin seems to be just wrong. The gene sequences of Ebola and polio have apparently been published (Sanchez, A., et al. (1993), "Sequence analysis of the Ebola virus genome: organization, genetic elements and comparison with the genome of Marburg virus," Virus Research 29, 215-240; and Stanway, G., et al. (1983), "The nucleotide sequence of poliovirus type 3 leon 12 a1b: comparison with poliovirus type 1," Nucleic Acids Res. 11(16), 5629-5643). (I don't claim to be knowledgeable about viruses; in the former case I am relying on the statement that "Sanchez et al (1993) has published the sequence of the complete genome of Ebola virus" from John Crowley and Ted Crusberg, "Ebola and Marburg Virus: Genomic Structure, Comparative and Molecular Biology"; in the latter case it may not be publication of the complete genome, but it is at least part.)
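For what it's worth, published viral genomes are also retrievable today from public sequence databases. Here's a minimal sketch of how one might fetch such records using Biopython's Entrez interface; note that the accession numbers below are my own assumptions about which reference records are relevant, not anything cited in the talk or the papers above:

    # Minimal sketch: retrieving published viral genome records from NCBI GenBank.
    # Requires Biopython and network access; the accession numbers are assumed,
    # not taken from the talk or the papers cited above.
    from Bio import Entrez, SeqIO

    Entrez.email = "you@example.com"  # NCBI asks callers to identify themselves

    # NC_002549: Zaire ebolavirus reference genome (assumed)
    # NC_002058: poliovirus reference genome (assumed)
    for accession in ["NC_002549", "NC_002058"]:
        handle = Entrez.efetch(db="nucleotide", id=accession,
                               rettype="gb", retmode="text")
        record = SeqIO.read(handle, "genbank")  # parse one GenBank record
        handle.close()
        print(record.id, len(record.seq), "bases:", record.description)

If these sequences were truly sequestered, they wouldn't be a few lines of freely available code away.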
Laughlin talked about the famous issue of The Progressive magazine featuring an article by Howard Morland titled "How H-Bombs Work." He showed the cover of the magazine, which read, "The H-Bomb Secret--How we got it--why we're telling it." Laughlin said that the DoJ obtained an injunction against publication of the article and took the matter into secret hearings. The argument was that the article was a threat to national security and a violation of the Atomic Energy Act. The judge said that the rule against prior restraint didn't apply because the material was so dangerous that "no jurist in their right mind would put free speech above safety." Laughlin said, "Most people think the Bill of Rights protects you, but this case shows that it doesn't." After the judge forbade publication, the article was leaked to a couple of "newspapers on the west coast," after which the DoJ dropped the case and the article was published. According to Laughlin, this was strategy: he suspects they didn't pursue the case because the outcome would have been to find the AEA unconstitutional, and dropping it kept the AEA as a potential weapon for future cases. He said there have been only two prosecutions under the criminal provisions of the AEA in the last 50 years, but it is "inconceivable that it was only violated twice. The country handles its unconstitutionality by not prosecuting." The U.S., he said, is like a weird hybrid of Athens and Sparta, favoring both openness and war-like secretiveness. These two positions have never been reconciled, so we live in an unstable situation that favors both.
He also discussed the case of Wen Ho Lee, a scientist from Taiwan who worked at Los Alamos National Laboratory and took home items classified as "PARD" (protect as restricted data), even though everyone is trained repeatedly that you "Don't take PARD home." When he was caught, according to Laughlin, he said both "I didn't know it was wrong" and "I thought they were going to fire me, so I took something home to sell." The latter sounds like an admission of guilt. He was put into solitary confinement for a year (actually 9 months), and then the case of 50 counts of AEA violations was dropped. Laughlin characterized this as "extralegal punishment," and said "we abolish due process with respect to nuclear data." (Wen Ho Lee won a $1.5 million settlement from the U.S. government in 2006 before the Supreme Court could hear his case. Somehow, that doesn't seem to me to be a very effective deterrent.)
Laughlin said that we see a tradeoff between risk and benefit, not an absolute danger. The risk of buildings being blown up is low enough that diesel fuel and fertilizer remain legal: bombs made from ammonium nitrate and diesel fuel are very easy to make, and our protection isn't the hiding of technical knowledge, but the fact that people just don't do it. Nuclear weapons, however, are regarded as so much more dangerous that the technical details are counted as absolutely dangerous--no amount of benefit could possibly be enough. He said that he's writing a book about energy and "the possible nuclear renaissance unfolding" (as a result of the need for non-carbon-emitting energy sources), and that the U.S. and Germany are both struggling with this legal morass around nuclear information. (Is the unavailability of nuclear knowledge really the main or even a significant issue in nuclear plant construction in the United States? General Electric (GE Energy) builds nuclear plants in other countries.)
Laughlin said that long pointy knives can be dangerous, and there's a movement in England to ban them. Every society deals with the issue of technical knowledge and where to draw the lines. (Is it really feasible to ban knives, and does such a ban constitute a ban on knowledge? How hard is it to make a knife?)
At this point he moved on to biology, and showed a photograph of a fruit fly with legs for antennae. He said, "so maybe antennae are related to legs, and a switch in development determines which you get. The control machinery is way too complicated to understand right now." (Really?) "What if this was done with a dog, with legs instead of ears. Would the person who did that go to Stockholm? No, they'd probably lose their lab and be vilified. In the life sciences there are boundaries like we see in nuclear--things we shouldn't know." (I doubt that there is a switch that turns dog ears into legs, and this doesn't strike me as plausibly being described as a boundary on knowledge, but rather an ethical boundary on action.) He said, "There are so many things researchers would like to try, but can't, because funders are afraid." Again, I suspect that most of these cases are ethical boundaries about actions rather than knowledge, though of course there are cases where unethical actions might be required to gain certain sorts of knowledge.
He turned to stem cells. He said that the federal government effectively put a 10-year moratorium on stem cell research for ethical reasons. Again, these were putatively ethical reasons regarding treatment of embryos, but the ban was on federally funded research rather than any research at all. It certainly stifled research, but didn't eliminate it.
Next he discussed the "Millennium Digital Copyright Act" (sic). He said that "people who know computers laugh at the absurdity" of claiming that computer programs aren't formulas and are patentable. He said that if he writes a program that "has functionality or purpose similar to someone else's, my writing it is a violation of the law." Perhaps in a very narrow case where there's patent protection, yes, but certainly not in general. If he were arguing that computer software patents are a bad idea, I'd agree. He said, "Imagine if I reverse-engineered the latest Windows and then published the source code. It would be a violation of law." Yes, in that particular example, but there are lots of cases of legitimate reverse engineering, especially in the information security field. The people who come up with signatures for anti-virus and intrusion detection and prevention systems do this routinely, and in some cases have actually released their own patches for Microsoft vulnerabilities because Microsoft was taking too long to do it itself.
He said of the Microsoft Word and PDF formats that they "are constantly morphing" because "if you can understand it you can steal it." But there are legal open source and competing proprietary software solutions that understand both of the formats in question--OpenOffice, Apple's Pages and Preview, Foxit Reader, etc. Laughlin said, "Intentional bypassing of encryption is a violation of the DMCA." Only if that encryption is "a technological measure that effectively controls access to" copyrighted material, and the circumvention isn't done for purposes of security research, which has a big exception carved out in the law. Arguably, breakable encryption doesn't "effectively control access," though the law has certainly been used to prosecute people who broke really poor excuses for encryption.
Laughlin put up a slide of the iconic smiley face and said it has been patented by Unisys: "If you use it a lot, you'll be sued by Unisys." I'm not sure how you could patent an image, and while there are smiley face trademarks that have been used as a revenue source, they're held by a company called SmileyWorld, not Unisys.
He returned to biology again, to talk briefly about gene patenting, which he says "galls biologists" but has been upheld by the courts. (Though perhaps not for many years longer, depending on how the Myriad Genetics case turns out.) Natural laws and discoveries aren't supposed to be patentable, so it's an implication of these court decisions that genes "aren't natural laws, but something else." The argument is that isolating them makes them into something different than what they are when they're part of an organism, which somehow constitutes an invention. I think that's a bad argument that could only justify patenting the isolation process, not the sequence.
Laughlin showed a slide of two photos: the cloned dog Snuppy and its mother on the left, and a Microsoft Word Professional box on the right. He said that Snuppy was cloned when he was in Korea, and that most Americans are "unhappy about puppy clones" because they fear the possibility of human clones. I thought he was going to say that he had purchased the Microsoft Word Professional box pictured in Korea at the same time, and that it was counterfeit, copied software (which was prevalent in Korea in past decades, if not still), but he had an entirely different point to make. He said, about the software, "The thing that's illegal is not cloning it. If I give you an altered version, I've tampered with something I'm not supposed to. There's a dichotomy between digital knowledge in living things and what you make, and they're different [in how we treat them?]. But they're manifestly not different. Our legal system['s rules] about protecting these things are therefore confused and mixed up." I think his argument and distinction were rather confused, and he didn't use them in anything he said subsequently. It seems to me that the rules are pretty much on a par between the two cases--copying Microsoft Word Professional and giving it to other people would itself be copyright infringement; transforming it might or might not be a crime, depending on what you did. If you turned it into a piece of malware and distributed that, it could be a crime. But if you sufficiently transformed it into something useful that was no longer recognizable as Microsoft Word Professional, that might well be fair use of the copyrighted software. In any case in between, I suspect the only legally actionable offense would be copyright infringement, in which case the wrongdoing is the copying, not the tampering.
He put up a slide of Lady Justice dressed in a clown suit, and said that "When you talk to young people about legal constraints on what they can do, they get angry, like you're getting angry at this image of Lady Law in a clown suit. She's not a law but an image, a logos. ... [It's the] root of our way of relating to each other. When you say logos is a clown, you've besmirched something very fundamental about who you want to be. ... Legal constraints on knowledge is part of the price we've paid for not making things anymore." (Not sure what to say about this.)
He returned to his earlier allusion to slavery. He said that was "a conflict between Judeo-Christian ethics and what you had to do to make a living. It got shakier and shakier until violence erupted. War was the only solution. I don't think that will happen in this case. [The] bigger picture is the same kind of tension. ... Once you make Descartes a joke, then you ask, why stay?" He put up a slide of a drawing of an astronaut on the moon, with the earth in the distance. "Why not go to the moon? What would drive a person off this planet? You'd have to be a lunatic to leave." (I thought he was going to make a moon-luna joke, but he didn't, unless that was it.) "Maybe intellectual freedom might be that thing. It's happened before, when people came to America." He went on to say that some brought their own religious baggage with them to America. Finally, he said that when he presents that moon example to graduate students, he always has many who say "Send me, I want to go."
And that's how his talk ended. I was rather disappointed--it seemed rather disjointed and rambling, and made lots of tendentious claims--it wasn't at all what I expected from a Nobel prizewinner.
The first question in the Q&A was one very much like I would have asked, about how he explains the free and open source software movement. Laughlin's answer was that he was personally a Linux user and has been since 1997, but that students starting software companies are "paranoid about having stuff stolen," and "free things, even in software, are potentially pernicious," and that he pays a price for using open source in that it takes more work to maintain it and he's constantly having to upgrade to deal with things like format changes in PDF and Word. There is certainly such a tradeoff for some open source software, but some of it is just as easy to maintain as commercial software, and there are distributions of Linux that are coming closer to the ease of use of Windows. And of course Mac OS X, based on an open source, FreeBSD-derived operating system, is probably easier for most people to use than Windows.
I think there was a lot of potentially interesting and provocative material in his talk, but it just wasn't formulated into a coherent and persuasive argument. If anyone has read his book, is it more tightly argued?
I haven't read the book, but I had the same impression from his talk at Cato a couple of years ago.
How can he claim that the knowledge learned in schools isn't valuable? It's (for the most part) foundational knowledge required to acquire the more advanced forms of knowledge. And even the stuff that isn't necessarily foundational knowledge (like phys. ed. or sex ed.) is valuable in that it helps keep you healthy (at least in theory)--and without health it becomes near impossible to profit from any knowledge you might have.
I think he might have a point, though, about academic knowledge being valued over technical or practical knowledge. This was certainly the case when I was in school, and is at least still partly true now. Maybe "undervalued" would be a better word. After all, auto mechanics often earn pitifully low wages for the amount of knowledge and the amount of physical exertion required in their jobs. And the wage trend for entry-level CS jobs has been on a downward path since the late '80s (correct me if this is wrong).
There was a lot of interesting stuff mentioned in that talk, it seems. A lot of it is covered by Neal Stephenson in Cryptonomicon, which, while a novel, at least touched on many of these issues, and did so well, I thought.
I think perhaps he was engaging in frequent hyperbole to be provocative. I skimmed a few bits of his book on Google Books and it is definitely far better organized than his talk, and says more about some of the issues that he barely touched upon in his talk, like how biotech is privatizing the production of biological knowledge and providing pressures to not share data (out of profit motive, not claims that sharing knowledge is morally wrong). He places some blame on Bayh-Dole, the law that gave universities the right to control and profit from intellectual property produced by professors in their employ, even if funded by federal dollars.