Wednesday, June 14, 2006

Volcano seen erupting from space


Astronaut Jeffrey N. Williams, aboard the International Space Station, was the first to spot the Cleveland Volcano in the Aleutian Islands erupting three weeks ago, and took this nice photo.

Hat tip: The Two Percent Company.

Church of the Computer


The MareNostrum supercomputer at the Technical University of Catalonia in Barcelona, Spain, is housed in a 1920s chapel.

(Via BLDGBLOG.)

Monday, June 12, 2006

Bennett on Free Press net neutrality "facts"

Richard Bennett at the Original Blog has criticized Free Press's list of network neutrality facts, arguing that most of them are fictions, e.g.:

PSEUDO-FACT #1: Network Neutrality protections have existed for the entire history of the Internet.

REAL FACT: Actually, there is no legal precedent at all for the anti-QoS provision of the Neutrality regulations, and many commercial Internet customers use QoS today. Even the Internet2 Abilene network tried to use it.

This pseudo-fact is one I've repeatedly criticized network neutrality advocates for falsely asserting. The Free Press folks are the people managing the "Save the Internet" campaign.

The whole list is well worth reading, as are a couple of other recent posts at Bennett's blog. One is a note he wrote to Senator Barbara Boxer, which includes this bit:

The Snowe-Dorgan and Markey Amendments contain a poison pill that will stifle the evolution of the Internet, in the form of a prohibition against a Quality of Service surcharge:

If a broadband network provider prioritizes or offers enhanced quality of service to data of a particular type, it must prioritize or offer enhanced quality of service to all data of that type (regardless of the origin or ownership of such data) without imposing a surcharge or other consideration for such prioritization or enhanced quality of service.

The argument in favor of this provision says that it’s needed in order to prevent the formation of a two-tier Internet, where one tier has Quality of Service and the other doesn’t, and this is somehow bad for Daily Kos and Google.

This is a false claim, because the engineering math behind Quality of Service says it can’t be applied to every stream from every user. In Lake Wobegon all the children are above average, but on the Internet all the packets can’t be above average.
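Bennett's point that QoS can't be applied to every stream can be illustrated with a toy scheduler (a hypothetical sketch of my own, not anything from Bennett's post): if every packet is marked priority, the "priority" queue degenerates into plain FIFO, so universal prioritization prioritizes nothing.

```python
# Toy priority scheduler (hypothetical illustration, not from Bennett's post).
# Packets are labeled with a class: 0 = priority, 1 = best effort.

def service_order(priorities):
    """Indices of queued packets in transmission order:
    lowest class number first, FIFO within a class."""
    return sorted(range(len(priorities)), key=lambda i: (priorities[i], i))

# One VoIP packet (class 0) queued behind bulk traffic (class 1):
mixed = [1, 1, 1, 0, 1]
print(service_order(mixed))     # [3, 0, 1, 2, 4] -- the VoIP packet jumps ahead

# Mark *everything* priority and the ordering is just arrival order again:
everyone = [0, 0, 0, 0, 0]
print(service_order(everyone))  # [0, 1, 2, 3, 4] -- identical to FIFO
```

Priority only reorders traffic relative to lower classes; once every packet shares the top class, there is nothing left to reorder--which is Bennett's point that not all packets can be above average.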

Another is his comment on tomorrow's Senate hearing, where Ben Scott of the Free Press will represent the "pro-regulation side of the neutrality debate"--a "good choice," Bennett suggests, because Scott is "easily confused." In the same post, he reports on the presentation by Matt Stoller (of MyDD) at the Yearly Kos event, in which Stoller "had to admit that he knows nothing about the issue of Telecom policy, which was interesting because the regulations he proposes don’t actually relate to telecom policy. They’re a new and unprecedented intervention into Internet routing and service plan regulation, a totally virgin territory for government regulators. Stoller admitted that it’s just a good guys vs. bad guys issue for him, one that’s lots of “fun”."

Bennett asks, "So my question is this: should the US Congress take advice on virgin regulatory territory from someone who admits to knowing nothing about the subject matter?"

Verizon's Thomas Tauke on net neutrality

Declan McCullagh interviews Verizon's Thomas Tauke on net neutrality. A key Q&A:
What do you think of the tone of the debate, and the appearance of pro-Net neutrality spokespeople like Moby and Alyssa Milano?
Tauke: I think it's one of the stranger debates I've ever been involved in. It's almost like we're debating what is beauty and how do we define it and regulate it? The problem is that everyone has a different definition of Net neutrality. If you look at the four major companies that are supporting the Net neutrality arguments, there are three distinct definitions of what Net neutrality should mean.

The question becomes which way do you think the market will better develop? If government sets policy today that dictates how the market develops? We think it should develop in the free market space, and government regulation should come in when a problem becomes apparent.

He's right on the money here. Most net neutrality advocates don't even seem to know what they are advocating, let alone understand the current legal or technological structure of the Internet in the U.S. (or elsewhere). They just think the telcos are somehow trying to take control of the Internet--to block access to websites, redirect users to sites they didn't request, and intentionally degrade service to make things slow--and that they need to be stopped.

Hat tip: Matt S., the Only Republican in San Francisco.

Martin Geddes on net neutrality, federalism, and U.S. vs. EU

Martin Geddes has written a very interesting post at his Telepocalypse blog titled "You won't like this, not one bit," but I do like it, very much. He links to his past statements on network neutrality, then asserts that "over time, the architecture of the telecom system will resemble the political system around it." He compares the U.S. to the EU and notes the irony: the planned federalism of the U.S. (in which the states would run things their own way, competing with one another and evolving better rules in the process) has been supplanted by a much stronger federal government that sets most rules at the national level, while the EU, composed of nations with far more collectivist/statist traditions, has evolved into a collection of "competing regulatory regimes and voluntary cross-border cooperation compared to the centrally planned US communications economy."

For good measure, he throws in a comparison to networks: "That means the EU constitution is “edge-based”, and the US one doesn’t scale. Oops. Hey, just skip a generation and move straight to anarchism: peer-to-peer contracts, and a state whose only function is to enforce them."

Network security panel in Boston area

I'll be on a breakfast panel for MassNetComms on June 28 in Newton, MA, on this subject:
Secure Carrier Infrastructure in the IP Network
When customers talk to the suppliers of network services, whether VoIP, or broadband, or wireless, their most important requirements are associated with network security. Breaches in network security impact reliability and availability, with devastating revenue and competitive consequences. Service providers who demonstrate cost-effective security at all layers of the network and applications will be able to differentiate their services in an increasingly competitive market. The need for security is creating more demand for outsourced managed services, and thus a business opportunity for the carrier. Are carriers recognizing that security is integral to the value proposition? As the major carriers continue to invest in infrastructure, how does the architecture support network security?
More information about this event at the MassNetComms site.

When private property becomes the commons

While thinking about Jonathan Adler's presentation at the Skeptics Society conference, it occurred to me that the problem of botnets is, in effect, a tragedy of the commons. The private personal computers of consumers which are connected full-time to the Internet and are not kept up-to-date on patches have, in effect, become a commons to be exploited by the botherders. The owners of the computers are generally not aware of what's going on, as the bots generally try to minimize obtrusiveness in order to continue to operate. The actual damages to each individual are typically quite small (with some notable exceptions--botherders can steal and make use of any data on the machine, including personal identity information and confidential documents), and the individual consumer doesn't have sufficient incentive to prevent the problem (say, by spending additional money on security software or taking the time to maintain the system).

Similarly, the typical entry-level casual blogger may not have an incentive to keep his blog free of spam comments. Neither, for that matter, does commons-advocate Larry Lessig, whose blog's comments are full of spam, making them less useful than they otherwise would be--an amusing irony, given Lessig's position in his book Code. He argues that we need some subsidized public space on the Internet, but it seems to me private companies have already created it largely without public subsidy, and I think Declan McCullagh has the better case in his exchange with Lessig. (By contrast, Blogger does have an incentive to prevent spam blogs, which consume large amounts of its resources and make its service less useful--and so it takes sometimes heavy-handed automated actions to try to shut them down.)

Bruce Schneier has argued that the right way to resolve this particular problem is by setting liability rules to shift incentives to players who can address the issue--e.g., software companies, ISPs, and banks (for phishing, but see this rebuttal). I agree with Schneier on this general point and with the broader point that economics has a lot to teach information security.

"Banner farms" and spyware

Ben Edelman continues his valuable research with an exposé of Hula Direct's "banner farms," which are used to display banner ads in popups driven by spyware installations:
Hula cannot write off its spyware-sourced traffic as a mere anomaly or glitch. I have received Hula popups from multiple spyware programs over many months. Throughout that period, I have never arrived at any Hula site in any way other than from spyware -- never as a popup or popunder served on any bona fide web site, in my personal casual web surfing or in my professional examination of web sites and advertising practices. From these facts, I can only conclude that spyware popups are a substantial source of traffic to Hula's sites.
Edelman also notes that most of Hula's ads include JavaScript code or HTML refresh meta tags that automatically reload the ads fairly quickly. The effect is to display more ads, each shown for a shorter time than the advertisers are expecting.
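A single meta-refresh tag is enough to drive this behavior. The sketch below (the page and its 15-second interval are invented for illustration, not taken from Edelman's data) parses such a tag with Python's standard-library HTML parser and shows how quickly impressions pile up:

```python
# Hypothetical ad page; Edelman's report has the real markup and intervals.
from html.parser import HTMLParser

PAGE = ('<html><head><meta http-equiv="refresh" content="15"></head>'
        '<body><img src="banner.gif"></body></html>')

class RefreshFinder(HTMLParser):
    """Collect the intervals of any <meta http-equiv="refresh"> tags."""
    def __init__(self):
        super().__init__()
        self.intervals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("http-equiv", "").lower() == "refresh":
            # content is "seconds" or "seconds;url=..."
            self.intervals.append(int(a["content"].split(";")[0]))

finder = RefreshFinder()
finder.feed(PAGE)
interval = finder.intervals[0]
print(interval)          # 15 seconds between reloads
print(3600 // interval)  # 240 "impressions" per hour from one captive popup
```

Each reload counts as a fresh impression to the ad network, while the viewer sees each ad for only a fraction of the time the advertiser paid for.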

Hula doesn't have a direct relationship with its advertisers (Edelman diagrams the flows of cash and traffic between the parties), but the advertisers are being complacent and allowing it to happen. Among them: Vonage, Verizon, and Circuit City.

Finally, Edelman notes that some of the ad networks used by Hula have taken notice and started to take action. One ad network, Red McCombs Media, refused to pay a $200,000+ bill from Hula and has been sued by Hula for breach of contract.

Sunday, June 11, 2006

Adler on federal environmental regulation

At the Skeptics Society conference on "The Environmental Wars," Jonathan Adler gave a talk on "Fables of Federal Environmental Regulation." Adler made several points, the main ones being:

* Federal regulations tend to come late to the game, after state and local regulations or private actions have already begun addressing the problems. The recurring pattern is that there is an initial recognition of a problem, there's state and local regulation and private action to address it, and then there's federalization. I can add to Adler's examples the development of the cellular telephone industry, where private actors stepped in to allocate licenses through the "Big Monopoly Game" (a story told in the book Wireless Nation) when the FCC proved incompetent to do so itself; federal anti-spam legislation, which came only after many states passed anti-spam laws; and federal law to require notification of customers whose personal information has been exposed by system compromise (which still doesn't exist, though almost half the states now have some kind of hacking notification law). (In a related point, industries regularly develop products that completely sidestep federal regulations, such as the SUV, interstate banking, credit cards, money market accounts, and discount brokerages. The development of the latter financial products is a story told in Joseph Nocera's A Piece of the Action: How the Middle Class Joined the Money Class.)

* The causes of federal regulations are not necessarily the problems themselves, but are often rent-seeking by involved entities, which can create a barrier to other alternative solutions. Adler listed four causes of federal environmental regulations: increased environmental awareness (by the voters and the feds), increasingly nationalized politics (political action at a national level), distrust of states and federalism, and rent-seeking. He gave examples to illustrate.

* We don't see (I'd say "we tend not to see") environmental problems where we have well-defined property rights; the environmental problems occur in the commons (cf. Garrett Hardin's "The Tragedy of the Commons"). I disagree with making this an absolute statement since there are bad actors who disregard even well-established property rights (or liability rules).

Adler's intent was to raise skepticism about federal regulation on environmental matters on the basis of several points:

* History shows the problem already being addressed effectively in a more decentralized manner.
* Federal regulation tends to preempt state regulation, creating a uniform approach that doesn't allow us the benefits of seeing how different approaches might work--we can miss out on better ways of dealing with the issue.
* The rent-seeking behavior can produce unintended consequences that can make things worse or impose other costs.

While I'm not sure I agree with the implied conclusion that federal regulation is never helpful, I agree that these are good reasons to be skeptical.

The preemption issue in particular is a big one. The federal anti-spam law, CAN-SPAM, was pushed through--after years in which federal anti-spam bills had failed--once California passed a tough mandatory opt-in law. The federal law was passed largely through the efforts of Microsoft and AOL (whose lawyers helped write it) and preempted state laws that mandated opt-in or imposed any requirements contrary to the federal law. I don't think it's cynical to believe that preventing the California law from taking effect--which would potentially have affected online marketing efforts by Microsoft and AOL--was a major cause of the federal legislation's passage.

The benefit of preemption is that it creates a level playing field across the entire nation, which reduces the costs of compliance for those who operate across multiple states. But it also reduces the likelihood of innovation in law through experimentation with different approaches, and reduces the advantages of local entities in competition with multi-state entities. It also prevents a state with more stringent requirements from affecting the behavior of a multi-state provider operating in that state, when the requirements get dropped to a federal lowest common denominator. As regulation almost always has unintended consequences, a diversity of approaches provides a way to discover those consequences and make more informed choices.

Another issue is that many federal regulations provide little in the way of enforcement, and the more federal regulations are created, the less likely that any particular one will have enforcement resources devoted to it. If you look at the FCC's enforcement of laws against illegal telemarketing activity (such as the prohibition on prerecorded solicitations to residential telephones, and the prohibition on telemarketing to cell phones), it's virtually nonexistent. They occasionally issue a citation, and very rarely issue fines to telemarketers who are blatantly violating the law on a daily basis. In this particular case, the law creates a private right of action so that the recipient of such an illegal call can file a civil case, and this model is one I'd like to endorse. I've personally had far more effect on most of the specific telemarketers who have made illegal calls to my residence than the FCC has. Federal laws and regulations can be effective when they are applicable to a small number of large players who can be adequately policed by a federal agency (but in such cases those large players tend to also be large players in Washington, D.C., and have huge influence over what rules get set) or when the enforcement is pushed down to state, local, or even private levels (e.g., using property or liability rules rather than agency-based regulation). Otherwise, they tend to be largely symbolic, with enforcement actions only occurring against major offenders while most violations are left unpunished.

The most effective solutions are those which place the incentives on involved parties to voluntarily come to agreements that address the issues, and I think these are possible in most circumstances with the appropriate set of property and liability rules. A good discussion of this subject may be found in David Friedman's book, Law's Order: What Economics Has to Do With Law and Why It Matters.

There seems to be a widespread illusion that many problems can be solved merely by passing federal legislation, without regard for the actual empirical consequences of such legislation (or for the process that determines what actually gets put into it!). From intellectual property law to environmental law to telecommunications law (e.g., net neutrality), good intentions can easily lead to bad consequences when their holders don't concern themselves with such details. Friedman's book is a good start as an antidote to such thinking.

Saturday, June 10, 2006

George Ou explains QoS to Russell Shaw

In an exchange on ZDNet, George Ou gives a simple explanation of the benefits of QoS for VoIP traffic and why any form of "net neutrality" that prohibits it or requires it to be offered without premium charges is a bad idea:

I’ll say this loud and clear; QoS is a reordering of packets that is an essential part of network traffic engineering. Take the following example where A represents VoIP packets and b represents webpage packets.

No enhanced QoS policy
AbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbAbAbbbbbbAbA

With enhanced QoS policy
AbbbbbbbbbAbbbbbbbbbbAbbbbbbbbbbAbbbbbbbbbbA

Now note that there are only 5 A packets in the entire stream for either scenario and you still get the exact same throughput for the b packets with or without prioritization for the VoIP A packets. The difference is that the A packets are now a lot more uniform which makes sound quality go up and the webpage b packets don’t really care about uniformity since all they care [about] is that they get there at all intact. With this QoS example, you can improve VoIP without affecting the average throughput of web surfing. More precisely, QoS has ZERO throughput effect on non-prioritized [traffic] when there is zero congestion on the pipe. If it had been a congested network, then QoS [would] have minimal effect on non-prioritized traffic.
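Ou's claim can be checked directly from his two streams: the short script below just measures the spacing between consecutive A packets in each.

```python
# The two packet streams from Ou's example, copied verbatim.
no_qos   = "AbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbAbAbbbbbbAbA"
with_qos = "AbbbbbbbbbAbbbbbbbbbbAbbbbbbbbbbAbbbbbbbbbbA"

def gaps(stream):
    """Distances between consecutive VoIP 'A' packets -- a proxy for jitter."""
    pos = [i for i, p in enumerate(stream) if p == "A"]
    return [b - a for a, b in zip(pos, pos[1:])]

print(no_qos.count("A"), with_qos.count("A"))  # 5 VoIP packets either way
print(no_qos.count("b"), with_qos.count("b"))  # web throughput -- Ou's point is it matches
print(gaps(no_qos))    # wildly uneven spacing -> audible jitter
print(gaps(with_qos))  # near-uniform spacing -> steady audio
```

Both streams carry the same packets; QoS changes only their spacing, which is exactly why VoIP quality improves without costing the web traffic any throughput.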

Hat tip to Richard Bennett at the Original Blog.

Also see Dave Siegel on QoS and net neutrality.