Andres Martinez asks in today's New York Times Editorial Observer: Will We Remember 2004 as the Year of the Dean Bubble? He compares the early enthusiasts and financial supporters to the venture capitalists of the "Dot.com" bubble, a period in which now-departed Dean campaign manager Joe Trippi was heavily involved.
The Standard reports that SCO offers $250,000 reward for arrest of Mydoom worm author. Software provider SCO Group is also recommending a Network Associates stand-alone utility targeting the MyDoom worm that is sweeping the Internet, reportedly slowing servers as it mobilizes infected computers for a DDoS attack on SCO scheduled to launch on February 1. Details of SCO's reward offer.
On November 14, 2003, contents of memos obtained from computer files of two United States Senators were shared with and published by the Wall Street Journal and the Washington Times. On November 28, 2003, the Washington Post reported that an official investigation had begun and that Judiciary Committee Chairman Orrin G. Hatch (R-UT) had confirmed that a member of his staff "had improperly accessed some of the documents" and a second former staff member "may also have been involved." The memos, dated from 2001 through 2003, concerned Democratic strategies for opposing judicial nominees of President Bush. See Senate Opens Inquiry Into Leaked Memos.
The incident raises significant questions about the circumstances under which one can have a reasonable expectation of privacy, about digital security awareness, and about potential civil or criminal sanctions under existing law. (More ... )
According to a January 23 story in the Los Angeles Times, Senate Sergeant-at-Arms William Pickle has confiscated hard drives, enlisted forensic experts and conducted interviews in an attempt to pin down who accessed the 15 memos in question. Details of the investigation are still emerging.
The Boston Globe reported on January 22 that a hard drive in the office of Senate Majority Leader Bill Frist (R-TN) is among the material seized in the Sergeant-at-Arms investigation. The Globe also suggests that Senator Frist's chief judicial nominee adviser, Manuel Miranda, may have been involved.
The Globe quoted Miranda as denying any impropriety. "There appears to have been no hacking, no stealing, and no violation of any Senate rule," he told the Globe. "Stealing assumes a property right and there is no property right to a government document. . . . These documents are not covered under the Senate disclosure rule because they are not official business and, to the extent they were disclosed, they were disclosed inadvertently by negligent [Democratic] staff."
Democrats have tended to disagree with the analysis represented by Miranda's statement. Judiciary Chairman Orrin Hatch (R-UT) was quoted by the Globe as stating that he was "mortified that this improper, unethical and simply unacceptable breach of confidential files may have occurred on my watch."
On Jan. 24, Richard Powelson of the Knoxville News-Sentinel quoted Sen. Patrick Leahy (D-VT) as referring to "cybertheft" of confidential Democratic memoranda.
Sabrina Pacifici points us to Robert Vamosi's piece on ZDNet, "Security breach on Capitol Hill: It's criminal" (Jan. 26, 2004), in which he asserts that the breach is "as wrong as a criminal hacker breaking into a corporation's Web site. If these allegations hold up under investigation, those responsible should be punished just as a criminal would." Vamosi suggests that the incident also points up issues of lax computer security. He contacted Chris Rouland, vice president of Internet Security Systems' X-Force, who observed that, like many corporations, the Senate had focused its security efforts on the perimeter with few internal controls, creating what he called a "hard-candy shell with a soft chewy interior."
I'm no expert on the criminal law of computer file access, so I've some ignorant questions I hope our readers can help with via Comments or Trackback:
Q: Under what circumstances would some of the overt acts possibly committed here be regarded as criminal, and under which statutes?
Q: Regardless of legality, what does this say about the security practices in place at the United States Senate and among its staff?
Q: What internal controls on network computer use and on access to the records in question would have resulted in Senators having a greater expectation of privacy as to these politically sensitive files?
Comments or Trackback, please.
Steven Johnson taps self-organization theory to relate the behaviors of slime mold, ants, Sims and WTO protesters in his readable introduction to the emergence aspects of complexity theory, "Emergence: The Connected Lives of Ants, Brains, Cities and Software" (2001). One of the more illuminating works in Wired's Howard Dean Reading List mentioned in an earlier post at this site. (more ... )
"The Myth of the Ant Queen"
Johnson reminds us that despite the title, ant queens are not authority figures. Ant colonies grow and collectively develop solutions to environmental challenges, with no one directing anything. Instead, individual behavior, marked by scent trails that grow stronger with repeated use, reinforces itself through positive feedback. These scent trails create the colony's tangible form of cumulative "memory."
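[DKS aside: a toy Python sketch, my own and not from Johnson's book, of that positive feedback at work. Ants choose between two paths in proportion to the scent already laid down, each trip reinforces the chosen path, and the trails slowly evaporate; one path soon carries nearly all of the traffic.]

import random

def run_colony(trips=2000, evaporation=0.01, deposit=1.0):
    # Two candidate paths to food; both start equally attractive.
    pheromone = [1.0, 1.0]
    for _ in range(trips):
        total = sum(pheromone)
        # Each ant picks a path with probability proportional to its scent.
        path = 0 if random.random() < pheromone[0] / total else 1
        pheromone[path] += deposit                                # reinforcement
        pheromone = [p * (1 - evaporation) for p in pheromone]    # trails fade
    return pheromone

print(run_colony())   # one path typically ends up with nearly all the scent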
For thousands of years, cities (like ant colonies) have grown and organized themselves into complex arrangements that kept large numbers fed, housed, protected and employed, all without central planning. In cities over the centuries, pathways, hubs and trade-specific marketplaces have emerged from repeated use and persist, again as a form of cumulative memory that guides old and new dwellers to resources. He calls a city a "pattern amplifying machine; its neighborhoods are a way of measuring and expressing the repeated behavior of larger collectivities -- capturing information about group behavior, and sharing that behavior with the group." Johnson, p. 40.
Johnson points us to the work of the American scientist Warren Weaver (1894-1978), one of the founders of information theory, who divided scientific inquiry into three camps:
1) the study of simple systems with few variables (e.g., physics)
2) the study of "disorganized complexity" with a great many variables (e.g., probability theory)
3) the study of "organized complexity" with many interrelated variables (e.g., complexity theory).
Johnson observes that the study of complexity and emergence has taken off in recent years in part because of the development of the affordable computing power required to study it. Weaver's work influenced Jane Jacobs in her writing of "The Death and Life of Great American Cities" (1992). Johnson quotes Jacobs' observation that cities have "innate abilities for understanding, communicating, contriving and inventing what is required to combat their difficulties."
For another example, he points to the work of Oliver Selfridge at M.I.T. (one of the founders of artificial intelligence studies and author of the classic 1958 paper "Pandemonium"), who developed a model to teach a computer pattern recognition, relying on a distributed, bottom-up intelligence. He did so by creating swarms of simple programs ("demons") that reported to higher-level demons that tallied the confidence votes of the swarm. Through a scheme Selfridge described as natural selection, "right"-guessing demons were reinforced and "wrong"-guessing demons were disregarded. Later researchers' experiments combined randomly generated programs with feedback and found that increasingly successful software "emerged" through similar natural selection.
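[DKS aside: a toy sketch in Python, not Selfridge's actual program, of the Pandemonium idea. Each low-level demon watches one feature of the input and shouts a guess; a decision demon tallies the weighted shouts; demons that point toward the right answer gain influence, and those that point the wrong way lose it.]

import random

def pandemonium(samples=3000, n_bits=8):
    weights = [0.0] * n_bits                 # one low-level demon per bit position
    for _ in range(samples):
        bits = [random.choice([0, 1]) for _ in range(n_bits)]
        target = 1 if sum(bits) > n_bits // 2 else 0          # "mostly ones?"
        # The decision demon tallies the weighted shouts of the swarm.
        votes = sum(w * (1 if b else -1) for w, b in zip(weights, bits))
        guess = 1 if votes > 0 else 0
        # Reinforce demons that pointed toward the right answer, penalize the rest.
        for i, b in enumerate(bits):
            weights[i] += 0.1 * (target - guess) * (1 if b else -1)
    return weights

print([round(w, 2) for w in pandemonium()])   # useful demons end up with positive weight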
The work of researchers like Weaver, Jacobs and Selfridge combined, accumulated and fed back upon itself. The study of organized complexity emerged as a distinct school of thought, recognized by several Nobel prizes and by best-selling books like Hofstadter's "Godel, Escher, Bach" (1999), Gleick's "Chaos" (1988) and Waldrop's "Complexity" (1992), as well as by popular simulation games like SimCity.
"Street Level"
Johnson tells us about Deborah Gordon's studies of ant colonies that suggest five fundamental principles for building local knowledge into macro-intelligence and adaptability:
* More is different
-- a critical mass of instances is needed to observe emergent behavior
* Ignorance is useful
-- emergent systems get unwieldy if components are complicated
* Encourage random encounters
-- they enable adaptation and discovery
* Look for patterns in the signs
-- pattern detection leads to circulation of meta-information
* Pay attention to your neighbors
-- "local information can lead to global wisdom"
Gordon also learned that to understand ant colonies, one must study them over the decades of their lifespan as their global behavior emerges. "The persistence of the whole over time -- the global behavior that outlasts any of its component parts -- is one of the defining characteristics of complex systems." Johnson, p. 82.
But does not human volition distinguish humans from ants? Johnson contends that human consciousness is limited to a scale that deprives it of the millennial scale of ant-like decision patterns. Thinking of a city as a "superorganism," Johnson says that it triumphs over other social forms because it possesses "a kind of emergent intelligence: an ability to store and retrieve information, to recognize and respond to patterns in human behavior. We contribute to that emergent intelligence, but it is almost impossible for us to perceive that contribution, because our lives unfold on the wrong scale." Johnson, p. 100.
"The Pattern Match"
Johnson extends his discussion to the concept of "learning" as something not limited to organisms with conscious awareness. As one instance of non-conscious learning, he points to an organism's development of antibodies in response to exposure to viruses or bacteria. "It's about altering a system's behavior in response to ... patterns in ways that make the system more successful at whatever goal it's pursuing." Johnson, p. 104. Cities learn also, he writes, being "patterns in time" that keep their shapes over centuries. He reminds us of Lewis Mumford (1895-1990), who described a city as "a structure specifically equipped to store and transmit the goods of civilization."
Is the World Wide Web also learning? Johnson thinks not. Although it self-organizes like a city into stable macro-shapes, it is not adaptive, he writes, because it lacks natural selection. It could be made adaptive, he suggests, if hyperlinks were bi-directional instead of one-directional. All emergent systems are built out of reciprocal feedback. "Self-organizing systems use feedback to bootstrap themselves into a more orderly structure." Johnson, p. 121. He points to developing systems (using Alexa as an example) that build connections between websites based on user traffic, much like the scent trails of ants and the paths and ways of an evolving city. Such feedback systems may, Johnson writes, lead to adaptive behavior comparable to learning.
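[DKS aside: a highly simplified Python sketch, not Alexa's actual system, of traffic-based feedback. Each browsing session reinforces a link between every pair of sites it touches, and "related sites" are simply the most heavily worn trails; the sessions below are made-up example data.]

from collections import defaultdict
from itertools import combinations

cooccurrence = defaultdict(int)

def record_session(sites_visited):
    # Every pair of sites visited together gets its trail reinforced.
    for a, b in combinations(sorted(set(sites_visited)), 2):
        cooccurrence[(a, b)] += 1

def related(site, top_n=3):
    # Sites most often visited alongside `site`: the most-trodden footpaths.
    scores = defaultdict(int)
    for (a, b), count in cooccurrence.items():
        if site == a:
            scores[b] += count
        elif site == b:
            scores[a] += count
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

record_session(["slashdot.org", "kuro5hin.org", "wired.com"])   # example data only
record_session(["slashdot.org", "kuro5hin.org"])
print(related("slashdot.org"))   # kuro5hin.org ranks first: two sessions reinforce it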
[DKS query: Does the development of bi-directional linking tools, such as Movable Type's Trackback, and the evolution of the self-referential "blogosphere" provide the feedback loop to which Johnson refers as essential for the Web to "learn?"]
"Listening to Feedback"
As an example of the effects of distributed control and positive feedback, Johnson points to the media's handling of the 1998 "Naughtygate" scandal about President Bill Clinton and several women, including Gennifer Flowers, Monica Lewinsky and Kathleen Willey. Years before, Johnson contends, the allegations would probably not have reached the level of national news, and could have disappeared completely. But independent news sources like the Drudge Report had emerged, and CNN's desire to be competitive led it to allow local stations to air stories without network clearance.
Led by Drudge's "outing" of a Newsweek story killed by its editors, those independent sources broke the story. Despite attempts by the White House to dismiss it, the report was picked up by others, and the self-reinforcing power of positive feedback made the story *about* the story an item the national media had little choice but to carry. See, e.g., Howard Kurtz, "With Time to Fill, Media Stretch Limits," Washington Post (Jan. 24, 1998; Page B01).
Johnson uses this case as an illustration of how positive feedback amplifies its own signal, up to the limits of the media's capacity to carry it. Negative feedback, on the other hand, enables equilibrium despite changing and unpredictable external conditions. Negative feedback enables a complex system to become adaptive and to achieve homeostasis, a state of self-regulating stability. Johnson suggests that one can adjust the rules of a feedback system in ways that affect the resulting outcome, in order to foster desired values.
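[DKS aside: a generic numerical illustration, with arbitrary parameters of my own choosing, of the two kinds of feedback. Positive feedback amplifies a signal until it hits a ceiling; negative feedback keeps correcting against the deviation and settles toward a set point, the homeostasis Johnson describes.]

def positive_feedback(signal=1.0, gain=1.5, ceiling=100.0, steps=12):
    history = []
    for _ in range(steps):
        signal = min(signal * gain, ceiling)    # each retelling amplifies the story
        history.append(round(signal, 1))
    return history

def negative_feedback(temp=30.0, set_point=20.0, damping=0.3, steps=12):
    history = []
    for _ in range(steps):
        temp -= damping * (temp - set_point)    # the correction opposes the deviation
        history.append(round(temp, 1))
    return history

print(positive_feedback())   # runs up to the ceiling and stays there
print(negative_feedback())   # settles toward the set point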
"Control Artist"
Supercomputing guru Danny Hillis built a software system that evolved, or learned, ways to sort numbers, Johnson tells us. However, the efficiency of the system plateaued with repeated iterations, as it hit "false peaks" in the software "fitness landscape." Hillis overcame that obstacle by introducing "predator" routines that constantly threatened the evolved sorting routines. The higher the sorting programs climbed in efficiency, the more challenging the predators became. The resulting "arms race" produced increasingly competent programs as the system functioned more like an environment than an organism. See Kevin Kelly's Out of Control, Chap. 15, for a bit more on Danny Hillis' studies.
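[DKS aside: a skeletal Python sketch of that coevolutionary loop, nothing like Hillis's actual system in scale or detail. Candidate sorting networks are scored against a population of test arrays, and the test arrays themselves evolve toward the cases the current networks get wrong, so the "predators" sharpen as the sorters improve.]

import random

SIZE = 4   # evolve networks that sort 4-element arrays via compare-exchange pairs

def apply_network(network, arr):
    arr = list(arr)
    for i, j in network:
        if arr[i] > arr[j]:
            arr[i], arr[j] = arr[j], arr[i]
    return arr

def random_pair():
    return tuple(sorted(random.sample(range(SIZE), 2)))

def sorter_fitness(net, cases):
    return sum(apply_network(net, c) == sorted(c) for c in cases)

def case_fitness(c, sorters):
    return sum(apply_network(net, c) != sorted(c) for net in sorters)   # count of defeats

def mutate_network(net):
    net = list(net)
    net[random.randrange(len(net))] = random_pair()
    return net

def mutate_case(c):
    c = list(c)
    c[random.randrange(SIZE)] = random.randrange(10)
    return c

sorters = [[random_pair() for _ in range(6)] for _ in range(20)]
cases = [[random.randrange(10) for _ in range(SIZE)] for _ in range(20)]

for generation in range(200):
    sorters = sorted(sorters, key=lambda n: sorter_fitness(n, cases), reverse=True)[:10]
    cases = sorted(cases, key=lambda c: case_fitness(c, sorters), reverse=True)[:10]
    sorters += [mutate_network(n) for n in sorters]   # breed the surviving sorters
    cases += [mutate_case(c) for c in cases]          # harder test cases survive too

best = max(sorters, key=lambda n: sorter_fitness(n, cases))
print(best, sorter_fitness(best, cases), "of", len(cases), "test cases sorted")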
"The Mind Readers"
Humans and chimps are among the very few organisms that have evolved an awareness that others' minds are also aware, Johnson tells us. He says 150 is the usual practical limit to the number of other minds that one can track. To get over that limit, we have learned to mentally model people in terms of their categorization into neighborhoods, groups, classes or political parties, he suggests.
Though a bottom-up organization can produce brilliant innovation, it is itself (predictably) unpredictable, making it risky as a format in which to operate a business. Some, says Johnson, see the real battle of the near future as between hierarchical forces and decentralized forces. A similar dialectic has been observed between capitalism and the noncapitalism of innovators.
On politics, Johnson offers his views. "In fact, the needs of most progressive movements are uniquely suited to adaptive, self-organizing systems: both have a keen ear for collective wisdom; both are naturally hostile to excessive concentrations of power; and both are friendly to change. For any movement that aims to be truly global in scope, making it almost impossible to rely on centralized power, adaptive self-organization may well be the only road available." Johnson, p. 224.
As examples, he points to the NGO swarms at the World Trade Organization meeting in Seattle in 1999, which some saw as hopelessly unfocused. "What they fail to recognize is that there can be power and intelligence in a swarm, and if you're trying to do battle against a distributed network like global capitalism, you're better off becoming a distributed network yourself." Johnson, p. 226. See also "The Non-Governmental Order: Will NGOs Democratise, or Merely Disrupt, Global Governance?" Economist (Dec. 11-17, 1999).
Control, or lack thereof, is a key sticking point, as Johnson reminds us in the last chapter, when he says that "understanding emergence has always been about giving up control, letting the system govern itself as much as possible, letting it learn from the footprints." Johnson, p. 234.
IDG News reports (seen via The Standard) that the DVD CCA, the holder of the rights to the CSS technology, has dropped its complaint against Andrew Bunner and others who published the DeCSS tools that allowed decryption of the DVDs protected by CSS. The group's attorney cites an "evolving legal strategy." See: Industry group drops DVD copying case. This case was outlined in a context of the legal theory of trade secrets in Unintended Consequences: Institutional Secrets in a Time of "Smart Mobs".
During The Bubble, The Industry Standard was (to me) an indispensable read. It combined just enough technology with just enough economics and business information for my tastes and needs. I noted how its girth swelled with glossy ads as venture capital spewed into what turned out to be a classic mania. When its advertisers went bust and the mag went from fat to skinny, it suffered the same fate. I still miss it.
The URL, even on life support, still drew a healthy attendance, so its owners (trustees?) have allowed it to be the locus of a group blog. Authored by some of its past pixel-stained wretches, it has enough potential that we should give it an eyeball, poke it in the ribs and see how it looks. Maybe bring it a cup of chicken soup. Jimmy Guterman just signed off his first shift of a week's tour of duty scribbling, and another is about to be let out of the debtors' dungeon to step up to the virtual typesetting machine. See: Internet Business News from The Industry Standard: What is this?
By the way, they chose Movable Type for their content management system. Excellent taste.
Two central themes come to mind when thinking about the sequence of four cases that I sketched in Institutional Secrets in a Time of "Smart Mobs".
Both derive from the Internet's ability to provide cheap, instant and global communication and its even more powerful ability to facilitate self-organization. Some thoughts follow.
(more ... )
The Discipline of the Free Press
At the time of the Pentagon Papers, the Internet was an immature system used by a handful of rocket scientists; the World Wide Web did not exist. The ability to globally publish something like the Pentagon Papers resided in a few centrally managed institutions like the New York Times or the U.S. Government Printing Office. While the Constitution of the United States protects the freedom of the press, that protection has limits, and when the Times began publishing articles based upon the Papers, the government sought to assert those limits and stop the Times. The Times won that court battle, with minimal delay, thanks to the readiness of a majority of the Justices of the Supreme Court to take jurisdiction and make a decision.
What if it had lost? Would the publishers of the Times have chosen to defy the federal government, disregard a definitive decision of the United States Supreme Court, and proceed to publish information from documents labelled "Secret" by the United States Government?
Such a radical step would likely have led to an escalation of contempt citations, arrests of senior management, even seizure of the presses by federal marshals enforcing court orders. Not a choice likely to be supported by a board of directors responsible to public shareholders.
The University of California faced similar realities when considering its options with the Cigarette Papers. An arm of the state, the University had been preparing to publish the Papers on compact disk and over the Internet, but once sued by Brown & Williamson, it shut off public access pending the decision of the courts. Like the Times, the University was ultimately allowed to publish.
What if the ruling had gone the other way? Would the University's management realistically have chosen to defy the court and publish the papers anyway?
If it had, one might expect a scenario like that of August 2003, when Alabama Chief Justice Roy Moore refused to comply with a federal court order to remove a stone monument of the Ten Commandments from the rotunda of the Alabama Supreme Court building. After a brief moment in the limelight, the Chief Justice was suspended and the court order was carried out by his successor.
In both cases, publication was dependent upon a centrally managed institution that was in turn dependent upon the tacit approval of the government. That dependency assured that the Times and the University (and others similarly situated) would act in a manner most would characterize as "disciplined and responsible to the rule of law."
De-Centralization of the Free Press
By the close of the millennium, the publishing environment was rapidly changing in dramatic ways. Access to the World Wide Web was fast, cheap and ubiquitous, particularly among students using high-speed university connections. Those young people had grown up publishing and collaborating on the Web, through email, personal web pages, peer-to-peer "instant messaging," online bulletin boards and multi-player games. Despite stern warnings against piracy, they were making and sharing unauthorized copies of music, videos and software in increasing quantities, reciting the mantra that "information wants to be free."
This peer-to-peer publishing capacity and ubiquitous access combined to open a channel around the centrally managed institutions that were once necessary to achieve widespread global distribution of information. The DeCSS code, for example, could be spread quickly, easily and at virtually no cost through individual websites or peer-to-peer file exchanges. Because there was no central institutional "choke point," shutting down continued publication became a much more difficult challenge for intellectual property rights holders and authorities.
In the case of institutional secrets, such as the elements of the CSS code, the peer-to-peer publishing capacity had especially powerful effects. The widespread publication of the information, by compromising the essential claim of "secrecy," damaged the holders' claim to trade secret protection, as Justice Moreno argued in his concurring opinion in the Bunner case.
This same fast, cheap and global publication capability was tapped by the students who came upon Diebold's compromising email records. However, they found that their publication ability was dependent upon the cooperation of the universities that operated the infrastructure. Presented with takedown notices based upon the DMCA, the universities' reaction was predictable and just what the DMCA was designed to accomplish: the shutdown of student websites that published the Diebold records.
Emergence of a Self-Organized Network
It was at this point that the other interesting thing happened: the emergence of a self-organized network that swiftly distributed copies of the records to multiple servers faster than Diebold could discover and shut them down.
A pre-existing group of activist students centered at Swarthmore became a key hub of the network. The group, called "Why War?", had been formed shortly after the United States invasion of Afghanistan in 2001 and focused upon distribution of information favoring non-violent efforts for peace and justice. It found appeal among various student and activist groups and developed a following among subscribers to its email list and RSS feed.
At their website, the founders of Why War? advocate applying the principles of swarming and "flash mobs" to political expression. An August 2003 essay by one of its founders, Micah White, "Swarming and the Future of Protesting," (2003) proposes practical implementations of the concepts in Howard Rheingold's "Smart Mobs" and RAND Corporation's John Arquilla and David Ronfeldt's studies of "Netwar" tactics. As they explored theoretical applications, an actual case study emerged on their doorstep.
In October of 2003, a few weeks after the Diebold e-mails emerged onto the Internet, Why War? published them on its website. Its host, Swarthmore, was soon faced with DMCA takedown notices from Diebold. Swarthmore decided to comply with the notices, even threatening Internet termination for students who included links to sites carrying the Diebold materials. Why War? and the Swarthmore Coalition for the Digital Commons responded by announcing a campaign of civil disobedience. Within hours, the story was picked up by Wired News ("Students Fight E-Vote Firm," Oct. 21, 2003) and by Slashdot ("Swarthmore Students Keep Diebold Memos Online," Oct. 21, 2003).
The Why War? site was promptly "Slashdotted," or swamped with traffic from Slashdot readers. Some of those visitors copied Diebold files to their own local archives and passed word about the controversy in their own weblogs, discussion forums and email lists. The resulting positive feedback loop raised the visibility of the controversy in traditional media and increased the number of copies of Diebold records loose on the Web. Such effects are a common result of a mention of a new online resource at Slashdot.org. See "Slashdotted: Surviving the Slashdot Effect," Geek.com, Oct. 17, 2002, and Stephen Adler, "The Slashdot Effect: An Analysis of Three Internet Publications" (undated, circa Feb. 1999).
An Organic Response to Threat
EFF's John Gilmore is credited with first saying that "the Internet interprets censorship as damage and routes around it." Gilmore appears to have been speaking about the Internet in the sense of its infrastructure of lines and routers. In the Diebold/Why War? conflict we observe an effect that is more organic, more like an immune response to a viral infection or the transplantation of foreign tissue.
In an organism's immune response, a foreign object or tissue is detected by individual cells at the locus of entry. Those local cells release chemical signals which migrate through the organism, triggering the release of a swarm of antibodies that find and counteract the threat.
In the case of Diebold and Why War?, the threat was an attempt to suppress the publication of certain information. Word of the attempt spread through the network, and other network users swarmed the locus of the threat, counteracting it by carrying off copies for preservation elsewhere. Although encouraged by Why War? activists, this response emerged without central organization or direction, as the result of many individuals independently reacting to the threat.
As such, it appears to involve a higher order of network response, one operating above the physical infrastructure level of circuits and routers. These patterns of response, of emergent order, are the continuing subject of the evolving science of networks and complex, self-organizing systems. It is to these sciences that we may look for ideas on how holders of intellectual property or institutional secrets might (or might not) succeed in countering such global, decentralized and organic behaviors.
One of the authors of The Cluetrain Manifesto provides his selection of thoughts and research from primary and secondary sources on the developing influence of mobile network technology upon society. He speculates about the emergence of self-organized mobile communication networks that may disrupt established economic and political structures. This work from the Howard Dean Reading List is thought provoking and readable, pointing the reader to some seminal original work as well as Rheingold's own predictions. Howard Rheingold, Smart Mobs: The Next Social Revolution (2002). See also the book summary and bibliography at the Smart Mobs website and current information and views on the Smart Mobs Weblog.
(more ... )
"Technologies of Cooperation"
One area of research Rheingold mentions is the evolution of cooperation among humans. Studies found that cooperation is most likely to emerge within small groups that interact repeatedly; the ability of the players to monitor behavior leads to the development of reputations and peer sanctions upon uncooperative players. According to one theory of the evolution of social behavior, the cumulative selection of cooperative behavior results in its being "hard-wired" into our evolutionary makeup. For further study of the role of cumulative selection in evolutionary theory, Rheingold refers us to the works of Oxford zoologist and evolutionist Richard Dawkins, known for The Selfish Gene (1990) and The Blind Watchmaker (1986 - reissued 1996) (also on the Howard Dean Reading List).
Another reference is to game theory experiments with playing the "Prisoners' Dilemma" repeatedly with the same people. Researchers dubbed the consistently most successful strategy "Tit for Tat": start out cooperating and thereafter do whatever the other player did on the last round (cheat if he had cheated, or if he cooperated, be forgiving and cooperate until he cheated again). Repeated over time, this strategy tended to increase the total incidence of cooperation, thereby optimizing the net outcome for the players as a group.
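[DKS aside: the repeated game is easy to try for oneself. A minimal Python sketch, using the conventional payoff table rather than anything from Rheingold's book, pits Tit for Tat against an unrepentant defector.]

def tit_for_tat(my_history, their_history):
    return their_history[-1] if their_history else "C"   # open with cooperation, then mirror

def always_defect(my_history, their_history):
    return "D"

# Conventional payoffs: mutual cooperation 3/3, mutual defection 1/1,
# lone defector 5, lone cooperator (the "sucker") 0.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))     # (30, 30): sustained cooperation pays best overall
print(play(tit_for_tat, always_defect))   # (9, 14): Tit for Tat gets burned once, then stops cooperating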
He points to Robert Axelrod's book The Evolution of Cooperation (1985) for studies evidencing how this strategy can naturally propagate throughout a larger population: "Within a pool of entirely uncooperative strategies, cooperative strategies evolve from small clusters of individuals who reciprocate cooperation, even if the cooperative strategies have only a small proportion of their interactions with each other. Clusters of cooperators amass points for themselves faster than defectors can." Axelrod (1985).
Axelrod's theory, according to Rheingold, is that the early clusters of cooperators came to recognize each other and evolved into the earliest form of social organization: tribes. Rheingold finds the classic tribal characteristics of reciprocity, cooperation and respect for reputation also appear among those virtual tribes that collaborated to create the Internet and open source software like Linux.
Rheingold introduces us to Reed's Law, which builds on Sarnoff's Law (the utility of a broadcast network is proportional to the number of viewers) and Metcalfe's Law (the utility of an interactive network is proportional to the square of the number of nodes). David P. Reed's theory is that the value of a group-forming network ("GFN") increases exponentially with the number of nodes. For a readable presentation of these three laws not found in Rheingold's book, see David P. Reed, Context Spring 1999 Issue -- Digital Strategy: Weapon of Math Destruction, and his exposition of the math and logic behind them in David P. Reed, Context Spring 1999 Issue -- Digital Strategy Supplement: Reed's Law.
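[DKS aside: the arithmetic behind the three laws is simple enough to tabulate. A short Python sketch, using the rough proportionalities rather than any precise formula of Reed's:]

def sarnoff(n):
    return n          # value grows with the size of the audience

def metcalfe(n):
    return n * n      # value grows with the possible point-to-point connections

def reed(n):
    return 2 ** n     # value grows with the possible subgroups a GFN can form

for n in (10, 20, 30):
    print(n, sarnoff(n), metcalfe(n), reed(n))
# 10 members:  10   100          1024
# 20 members:  20   400       1048576
# 30 members:  30   900    1073741824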
"Computational Nations and Swarm Supercomputers"
Rheingold views the power of peer-to-peer computing as "a human social power, not a mechanical one," and points us to the collection edited by Andy Oram, Peer-to-Peer: Harnessing the Power of Disruptive Technologies (2001). Rheingold notes that Napster's central server was designed so that the system would be commercially viable, but that design also made it vulnerable to targeted legal attack, while Gnutella was designed without a central server but is challenged by the problem of free riders.
"The Evolution of Reputation."
Rheingold shares with us the views of sociologist Paul Resnick, who sees that for reputation systems to function well as a reinforcer of cooperation, both transactors' identities and feedback about past interactions must be persistent, and transactors must afford reputation ratings respect when choosing potential counterparts in transactions. He points to other sociologists who found that people will be more altruistic than predicted by rational self-interest, but will also tend to penalize cheaters, even at their own expense. Combined, these elements of "reciprocal altruism" tend to improve the survival chances of those tribes that exhibit such behavior.
See Paul Resnick's "Sabbatical Musings," a weblog.
"Smart Mobs: The Power of the Mobile Many"
Rheingold points us to the work of John Arquilla and David Ronfeldt, RAND Corporation scholars who focus on the emergence of decentralized, self-organized network forms of organization that can outmaneuver, exploit and dominate established institutions. Arquilla and Ronfeldt concentrate on developing theories of "Netwar," which they define as "an emerging mode of conflict in which the protagonists -- ranging from terrorist and criminal organizations on the dark side, to militant social activists on the bright side -- use network forms of organization, doctrine, strategy and technology attuned to the information age. * * * What all have in common is that they operate in small, dispersed units that can deploy nimbly -- anywhere, anytime."
Reference: Arquilla & Ronfeldt, "Networks, Netwars and the Fight for the Future," First Monday (March 2002). See also: Steven Johnson, Emergence (2001)
"Always-On Panopticon ... or Cooperation Amplifier?"
In 1791, Jeremy Bentham introduced the concept of a prison specially designed so that guards could see all prisoners at all times, without themselves being observed. The prisoners, not knowing when they were under observation, had to assume that they were always being observed, enforcing obedience with a minimum of resources expended on actual prisoner observation. Michel Foucault extended the Panopticon concept to theorize that the power derived from this asymmetric knowledge naturally imposes discipline, a regularity in behavior and relationships. Rheingold poses the question whether such discipline can evolve as cooperation does.
He points to Nonzero: The Logic of Human Destiny, in which Robert Wright proposes that new technologies create new chances for humans to engage in win-win (cooperative) transactions, and that social change results when people maneuver to attain the benefits of such transactions. He quotes Wright's observation that "new information technologies in general * * * very often decentralize power, and this fact is not graciously conceded by the powers that be. Hence a certain amount of history's turbulence, including some in the current era." Ibid., p. 154.
Of the works in the Howard Dean Reading List compiled by Wired, I found Rheingold's work less substantial and more speculative than Kelly's Out of Control, but potentially more attractive to the general reader interested in the development of socially cooperative behavior and the potential effects of mobile communications technology on post-modern society.
The Yale Information Society Project hosts a conference, "Digital Cops in Virtual Environment - CyberCrime and Digital Law Enforcement Conference," at Yale Law School March 26-28, 2004. They expect that the conference "will bring together policy makers, security experts, law enforcement personnel, social activists and academics to discuss the emerging phenomena of cybercrime and law enforcement. The conference will question both the efficacy of fighting cybercrime and the civil liberties implications arising from innovations in law enforcement methods of operation."
They have issued a call for papers, to be evaluated competitively, with the authors of the two best papers to be invited to present at the conference as the guests of the Project. Other selected papers will be published subsequent to the conference.
Ross Anderson is a Reader in Security Engineering at Cambridge University. He maintains a collection of links to papers and other resources on the subject of Economics and Computer Security, including Peer-to-Peer issues and Trusted Computing.
"More and more people are realising that information insecurity is often due to perverse incentives rather than to the lack of technical protection mechanisms. " -- Ross Anderson, about the Economics and Security Resource Page
Technology and constitutional rights intersect in ways that challenge traditional expectations of the security of government and corporate secrets. Networks allow "smart mobs" of individuals to share information in ways that are fast, cheap and out of control. Those with an agenda can today extract and publish institutional secrets in ways that may be hard to detect, let alone remedy with money damages or sanctions.
Q: Where should government and business look for protection? Tighter information security? Tough criminal laws and prosecutions? Prayer?
Q: What is the future of institutional secrets? (More ... )
The Pentagon Papers
In 1971, someone leaked to the New York Times the "Pentagon Papers," a top-secret DOD study critical of the United States involvement in Vietnam. The Times began publishing articles based on the contents, but the United States sued to block any further publication of the Top Secret material. The case quickly reached the Supreme Court, which decided that the damage to national security from publication did not outweigh the damage to the freedom of the press from prior restraint. The Court's ruling is a landmark decision on the balance between secrecy and free press. New York Times Co. v. United States, 403 U.S. 713 (1971)
The nine separate opinions (6 to 3 in favor of the newspapers) left open the possibility of criminal prosecution of the Times and the Post after publication, but found insufficient basis for prior restraint. Of the majority, two maintained that the First Amendment provided no room for any prior restraint of the press. The other four conceded that prior restraint could be justified in proper cases, some noting as exceptions publication of troop movements in time of war, prohibitions of obscenity, and restraints on publication of material in violation of copyright, which protects the form of expression rather than the ideas expressed. Ibid. Members of the Court have since written that the Pentagon Papers' relevance to a matter of significant public debate was an essential element in the outcome. Bartnicki v. Vopper, 532 U.S. 514 (2001).
The Cigarette Papers
In 1994, thousands of pages of internal documents previously kept secret by Brown & Williamson Tobacco Corporation were anonymously delivered to the University of California, San Francisco (UCSF), with portions also sent to Congressman Henry Waxman and the New York Times. In a related case, District Court Judge Harold Greene characterized the documents as possible evidence "supporting a 'whistle-blower's' claim that the tobacco company concealed from its customers and the American public the truth regarding the health hazards of tobacco products." Maddox v. Williams, 855 F. Supp. 406, 414-15 (D.D.C. 1994).
The university library added the papers to its tobacco industry archives, made them available to the public and prepared to publish scanned copies electronically. They quickly became a popular source for anti-tobacco litigators and activists, through whom B&W learned of UCSF's collection in January 1995. B&W demanded that UCSF return the documents, deny public access and disclose the names of all who had seen them, and sued when UCSF refused.
The California Superior Court allowed UCSF to keep the documents and publish them. In its opinion, it emphasized the lack of evidence that UCSF was involved in any wrongful taking of the documents, the fact that they were not being introduced into evidence against B&W, the strong public interest in the documents' relevance to public health issues, and the futility of action against UCSF when the documents were widely available elsewhere. The California Supreme Court let the decision stand, and on June 30, 1995, the University began publishing the documents on the Web. A year later, both a hard-bound and an online analysis of the collection were published by the University of California Press.
The DeCSS Algorithm
In 1999, "DVD Jon" Johanson, a Norwegian, reverse-engineered the proprietary technology inside the Content Scrambling System (CSS) used by the DVD Content Control Association (DVDCCA) to protect commercial DVDs from unauthorized copying. He wrote an algorithm that enabled unscrambling of the disks, and called it "DeCSS." He posted it on the Web and copies spread rapidly. When it learned about DeCSS, the DVDCCA demanded that operators of websites where it appeared take it down and sued those that refused, including Andrew Bunner.
In California, the Superior Court hearing the case against Bunner found that DVD CCA was likely to prevail on its claim that Bunner had violated California's trade secret act (a version of the Uniform Trade Secrets Act, or "UTSA"). The Court entered a preliminary injunction barring Bunner from using or disclosing DeCSS or linking to sites that disclosed it. The Court of Appeals found that DeCSS was "pure speech" and overturned the injunction as a First Amendment violation. In August of 2003, the California Supreme Court reversed the appellate decision and sent the case back for further proceedings.
The California high court agreed that DeCSS qualified as speech protected by the First Amendment, but ruled that an injunction could be justified by an actual trade secrets act violation. They balanced the governmental interests served by the trade secrets act against the magnitude of the speech restriction that would result from an injunction. They found that publishing the DeCSS code was not necessary to debate on a matter of substantial public interest such as that which characterized the 2001 decision in Bartnicki v. Vopper, 532 U.S. 514 (2001).
Justice Moreno, in his concurring opinion, agreed that an injunction could be justified in a proper protected speech case, but maintained that this was not one of them. He argued that the record was clear that DVD CCA had no case against Bunner because DeCSS was so widely disseminated that it was no longer actually "secret." As a result, he wrote, it was not protected by the trade secrets act, making an injunction an unlawful prior restraint on speech.
The case was sent back to the Superior Court for further consideration of the merits of the claim that the CSS technology is a protected secret and that Bunner's publication of DeCSS violated the trade secrets act. DVD Copy Control Association, Inc. v. Bunner, 31 Cal. 4th 864 (2003). The DeCSS code continues to be widely available on the Internet at international locations easily discoverable through use of a common search engine.
In December, a Norwegian appeals court affirmed the acquittal of "DVD Jon" Johansen on criminal charges of breaking the CSS copy protection on DVDs he bought. The charges were brought by the Norwegian Economic Crime Unit (ØKOKRIM) under Norwegian Criminal Code §145(2), upon the complaint of the DVD CCA and the Motion Picture Association of America (MPAA). The Norwegian court ruled that his actions were legal under Norwegian law. An earlier acquittal had been appealed by the government. See "Legal victory for 'DVD hacker'," BBC News 12/22/03, and ""DVD-Jon" Defeats Hollywood: Consumer Rights Upheld in Norway," IP Justice.
The E-Voting E-Mails
In August of 2003, electronic copies of thousands of internal e-mail messages between employees and contractors of Diebold, Inc. were posted on a publicly accessible website. The e-mails indicated Diebold had knowledge of security flaws and regulatory violations involving its electronic voting software widely used by state and local governments. News of the material spread, and other websites hyperlinked to the compilation, including student websites hosted on university computers.
When Diebold learned of the material, its lawyers issued "takedown" notices to ISPs that were hosting the material or sites linking to it, triggering the "safe harbor" provisions of 17 U.S.C. §512, the Digital Millennium Copyright Act (DMCA). Universities responded by directing students to take down the material and links, and cut off Internet access of students who failed to comply. Rather than filing the counter-notifications provided for in the DMCA, students moved the files to other sites and urged others to copy the files and "mirror" the archive on multiple hosts elsewhere, which many did. At Swarthmore, an activist group's website tracked the efforts of Diebold, the reactions of universities and students, and the spread of mirror sites throughout the Internet.
The national media picked up the story in October, widely publicizing the existence of the documents and Diebold's attempts to discourage their publication. In November, the website of Congressman Dennis Kucinich posted excerpts from the material, adding a public scolding of Diebold and a call for a Congressional investigation into its actions.
Online Policy Group (OPG), an ISP to which Diebold sent a takedown notice, filed a federal lawsuit to enjoin Diebold from further efforts to discourage publication. OPG got legal representation from the Electronic Frontier Foundation (EFF) and the Center for Internet and Society Cyberlaw Clinic at Stanford Law School. In a November 17 procedural hearing on a motion for a preliminary injunction, the District Court judge's questions focused on the public interest in information about voting systems and First Amendment issues raised by the controversy.
Diebold seemed to realize that the harder it pushed, the more adverse publicity it got and the further the information spread. On November 24, it advised the District Court that it would not sue those hosting copies of the materials for copyright infringement and was withdrawing its DMCA notices, as an indication of its commitment to an open discussion of "helping America vote better."
http://www.eff.org/Legal/ISP_liability/OPG_v_Diebold/DieboldResponse.pdf.
_____________
Some sources:
The Pentagon Papers
The Cigarette Papers
The DeCSS Algorithm
The E-Voting E-Mails
Business Week Online presents insights into the business issues of Voice Over Internet Protocol (VOIP) ... as the big telcos start to offer VOIP services, some analysts ask if they are serious about the technology, or just trying to co-opt the newcomers.
Clayton Christensen, in "The Innovator's Solution," warns about attempting to build a start-up on a technology that is disruptive for some, but not all, of the major players in the established industry. Are the VOIP "monoculture" start-ups doomed to be crushed by telcos who see VOIP as a sustaining technology?
Is this a "Strategic Inflection Point" ... "where the old strategic picture dissolves and gives way to the new, allowing the business to ascend to new heights" as Andy Grove defined it in "Only the Paranoid Survive"? Are there telecos able to act like Intel has for many years, and cannibalize its own business to survive the onslaught of others, as Andy Grove advised in his book?
Just some of the questions that come to mind reading Finally, 21st Century Phone Service
The earliest title on The Howard Dean Reading List, Kevin Kelly's annotated update on the state of cybernetic research, Out of Control, was first published in 1994, shortly after the public emergence of the World Wide Web (I read that edition; a 1995 edition has since been published). Yet Kelly writes about our body of knowledge and science as "a web of ideas pointing to, and reciprocally educating each other. Hypertext and electronic writing accelerate that reciprocity." He presents schools of thought holding that life itself, as well as evolution and organizational behavior, are predictable consequences of networks of connected interactors in a variable environment, moving us toward 'network culture' and democratic forms of society. Twenty-five pages of annotated bibliography (itself a worthwhile read) provide for further study. (More ... )
The central focus of the book is an educated layman's view of the science of how complex systems (such as life itself) spontaneously arise from disorder and grow steadily richer and more complex. How simple organisms, processes, even bits of computer code, can organize themselves from raw materials and then grow more complex as a result of their web of interaction with each other and a variable environment. How the computer simulations of Stuart Kauffman reveal that complicated networks routinely produce instances of spontaneous order and "strange loops" in which a sequence of effects loops back and becomes its own remote cause, in effect "crystallizing" life in clumps of stable interactions.
Physicist Erwin Schrödinger called this inherent force for life and organization "negentropy," to contrast it with "entropy," the measure, under the Second Law of Thermodynamics, of the tendency of all order to decay over time. These behaviors follow laws (as yet unclear to us) as strict as those governing light, according to Freeman Dyson's 1988 book Infinite in All Directions (out of print, to be republished 8/04), referenced by Kelly.
According to the research referenced, these complex systems exhibit the logic of biology, as they self-organize and emerge from successive layering of large numbers of simple systems that interact with each other. Such "vivisystems" exhibit four recurring facets of distributed being: 1) the absence of central control, 2) autonomy of the subunits, 3) high connectivity between the subunits, 4) a "webby" nonlinear causality of peers influencing peers.
Kelly looks also at "postDarwinist" schools of evolution, some of whom conclude that evolution is primarily a science of probability and that non-biological systems can and do evolve. It appears to many of those scholars that evolution is a natural property that emerges from a community of dynamic interactors (whether or not biological) in a variable environment. As the various interactors "co-evolve," each affecting the other's evolution, they generate what James Lovelock called a "persistent state of disequilibrium," the mark of life and evolving systems.
These post-Darwinist schools explain the discontinuous nature of evolution, and see it as happening in jumps ("saltatiously"). These jumps result because the nature of evolving systems is to differentiate into persistent clumps (Kauffman might say "crystals") from which further incremental change is difficult, sometimes impossible. Kelly suggests these characteristics apply also to "organic" but non-biological entities such as an economic firm or nation-state that is "severely limited in the directions and ways it can evolve, because it is a hierarchy composed entirely of subentities, which are also limited in their room for adaptation ... ." Kelly, p. 381.
"In the Network Era -- that age we have just entered -- dense communication is creating artificial worlds ripe for emergent coevolution, spontaneous self-organization, and win-win cooperation. In this Era, openness wins, central control is lost, and stability is a state of perpetual almost-falling ensured by constant error." Kelly p. 90. He sees this culminating in a transition from a hierarchical social order to a 'network culture,' in which democracy is "an unavoidable self-organizing strong attractor," as long as "ideas are free to flow and generate new ideas." Kelly p. 396.
Kelly takes this all together for some editorial conclusions on the costs and benefits of network forms of organization. Benefits that Kelly sees include adaptability, evolvability, resilience, boundlessness and novelty. "The only organization capable of unprejudiced growth, or unguided learning," Kelly writes, "is a network. All other topologies limit what can happen." Kelly p. 26. The downside is that network organizations are non-optimal, non-controllable, non-predictable, non-understandable and non-immediate.
"A decentralized, redundant organization can flex without distorting its function, and thus it can adapt," writes Kelly. "It can manage change. We call that growth." Kelly, p. 448. "But we cannot import evolution and learning without exporting control. * * * There is no control outside of a self-making system. Vivisystems, such as economies, ecologies and human culture, can hardly be controlled from any position. They can be prodded, perturbed, cajoled, herded, and at best, coordinated from within." Kelly pp. 448-449.
"The chief psychological chore of the 21st Century," writes Kelly, is "letting go, with dignity."
Kevin Kelly helped launch the WELL, was publisher and editor of the Whole Earth Review, and for many years served as Executive Editor of Wired, where he continues today as "Editor at Large." Out of Control is published by Addison-Wesley (1994).
In Wired 12.01: How the Internet Invented Howard Dean, Gary Wolf looks at the ingredients of the success of Howard Dean's campaign. "The power of Dean's campaign does not come from his appeal to Net users as an interest group," writes Wolf in January's Wired, "but from a fateful concurrence of other forces: a strong antiwar message; a vivid, individualist candidate; a lucky head start with Meetup; an Internet-savvy campaign manager in Joe Trippi; and, most important, a willingness to let a decentralized network of supporters play a tactical role."
Wolf reviews "five Internet maxims" as elements of the campaign success, and includes a "Howard Dean Reading List" of books about social networking as a bibliography on the network science involved. A side bar briefs you on Joe Trippi's background, with one foot in politics and one in Silicon Valley. (more ... )
The five maxims, which the article explains:
The Howard Dean Reading List
The whole article is online free now at Wired 12.01
In August 2003 the California Supreme Court resolved a conflict between the status of DeCSS software as protected speech and the status of trade secrets as protected property, applying the "Madsen" standard of scrutiny of speech restrictions proposed as a remedy for actual trade secret misappropriations. It found no application of the Bartnicki decision, which it read as limited to speech addressing matters of more substantial public concern than those involved in the publication of DeCSS. It reversed a Court of Appeals ruling that an injunction on use and republication of DeCSS was an unconstitutional prior restraint, and remanded for appellate review of the factual determinations in light of these principles.
Justice Moreno's concurrence called for a higher threshold of proof of actual trade secret misappropriation before allowing such restrictions on speech, cast doubt that DeCSS still actually qualified as a secret, and challenged the efficacy of license agreements that attempt to override statutory shelters for "reverse engineering."
DVD Copy Control Association, Inc. v. Bunner, 31 Cal. 4th 864 (2003).
Procedural Status
The controversy arose from the Internet publication of DeCSS, a program that decrypts content on DVDs secured with the Content Scrambling System ("CSS"). The program was developed by a licensee of CSS that reverse engineered its proprietary technology despite a provision in the license forbidding such reverse engineering. The program was widely republished on the Internet, one of its republishers being Mr. Bunner.
When DVD CCA discovered the decryption tool was at large, it filed legal action alleging trade secret misappropriation in violation of the California version of the Uniform Trade Secrets Act (UTSA), Cal. Civil Code §3426 et seq. The trial court found that DVD CCA was likely to prevail on the merits and issued a preliminary injunction against the defendants (one of whom was Mr. Bunner) forbidding use, copying, or distribution of DVD CCA's trade secrets generally and DeCSS specifically.
The Court of Appeals assumed as true the trial court finding that a trade secret misappropriation had occurred but found that the injunction constituted a prior restraint of "pure speech" and violated the First Amendment.
The California Supreme Court also assumed that a trade secret misappropriation had occurred, but reversed the Court of Appeals and remanded for further consideration, holding that the First Amendment did not preclude a preliminary injunction on the assumed facts.
The Court's Analysis
First, the Court acknowledged that computer code such as DeCSS qualified as speech protected by the First Amendment, citing, with discussion:
* Junger v. Daley, 209 F.3d 481 (6th Cir. 2000)
* Universal City Studios, Inc. v. Reimerdes, 111 F.Supp.2d 294 (S.D.N.Y. 2000)
* Universal City Studios, Inc. v. Corley, 273 F.3d 429 (2nd Cir. 2001)
* United States v. Elcom, Ltd., 203 F.Supp.2d 1111 (N.D.Cal. 2002)
Second, the Court found that the injunction was "content neutral" because its principal purpose was protection of a statutory property interest, with only incidental impact upon the content of the enjoined communication, citing, with discussion:
* Ruckelshaus v. Monsanto Co., 467 U.S. 986 (1984)
* Kewanee Oil Co. v. Bicron Corp., 416 U.S. 470 (1974)
* San Francisco Arts & Athletics, Inc. v. United States Olympic Comm., 483 U.S. 522 (1987)
As a "content neutral" sanction, the Court applied a lower level of scrutiny prescribed by Madsen v. Women's Health Center, 512 U.S. 753 (1994), under which the Court ust ask "whether the challenged provisions of the injunction burden no more speech than necessary to serve a significant government interest." Madsen, 512 U.S. at 765.
The Court distinguished the decision in Bartnicki v. Vopper, 532 U.S. 514 (2001), noting that "five justices in Bartnicki endorsed the application of a lesser standard even though the statute arguably prohibited 'pure speech.'" The California Supreme Court favored Rodney A. Smolla's analysis of Bartnicki in "Information as Contraband: The First Amendment and Liability for Trafficking in Speech" 96 Nw. U.L.Rev. 1099 (2002). In that article, Smolla makes the intriguing argument that the true holding of Bartnicki was expressed not by Justice Stevens' majority opinion, but by the combined opinions of the two concurring and the three dissenting justices in that case.
Bunner provides useful guidance about what might qualify as specially protected speech on matters of public concern. It does so through its discussion of why Mr. Bunner's situation does not qualify. The Court applied the test found in Connick v. Myers, 461 U.S. 138 (1983), and examined the following elements of the content, form and context of the statements in question.
* The information in question was not publicly available and conveyed only technical information.
* DeCSS was not posted in order to comment on a public issue or to participate in a public debate. With the exception of a few "encryption enthusiasts," as the Court called them, the public would be interested only in the use of DeCSS, not the content of its code.
* The public debate over the use of encryption and copy protection of DVDs does not require disclosure of DeCSS, and the debate will not be impaired by enjoining its distribution.
Third, the Court noted significant governmental interests served by California's trade secrets law, including incentives for innovation, the maintenance of commercial ethics and the protection of property interests. The Court found that the DeCSS content disclosed by Bunner did not address, involve or illustrate matters of substantial public concern, so that Bartnicki did not control its decision. It also saw a clear distinction from the cases involving attempts to enjoin publication of information lawfully obtained, such as:
* Florida Star v. B.J.F., 491 U.S. 524 (1989)
* Smith v. Daily Mail Publishing Co., 443 U.S. 97 (1979)
* Landmark Communications, Inc. v. Virginia, 435 U.S. 829 (1978)
* Oklahoma Publishing Co. v. District Court, 430 U.S. 308 (1977)
* Cox Broadcasting v. Cohn, 420 U.S. 469 (1975)
On balance, it found that the injunction satisfied the Madsen standard of scrutiny, assuming that an actual trade secret misappropriation occurred.
Fourth, the Court found that the injunction was not a prior restraint because it was content neutral and based upon unlawful conduct that had already occurred. It distinguished CBS Inc. v. Davis, 510 U.S. 1315 (1994) invalidating an injunction of a broadcast of a video revealing unsanitary practices in a meat packing plant. Without a finding that the video was unlawfully obtained, the substantial public concern with the facts revealed outweighed the interest of the packer in preventing the broadcast.
Finally, the Court applied the same analyses to dispose of Bunner's claims under the Constitution of the State of California.
It remanded the case for further review, emphasizing that the injunction was justified only if appellate review supported the finding that DVD CCA was likely to prevail at trial.
Justice Moreno's Concurring Opinion
In his opinion, Justice Moreno analyzed the case differently and questioned the validity of the trial court's findings. He cited the danger that a preliminary injunction may work to bar protected speech before adjudication of the merits of the speaker's constitutional claims, citing Pittsburgh Press Co. v. Pittsburgh Commission on Human Relations, 413 U.S. 376 (1973). Criticizing the majority's analysis as "incomplete," Justice Moreno characterized the injunction as "subject matter censorship" that was unjustified because DVD CCA's proprietary information had become so widely disseminated on the Internet that it was no longer actually a secret.
He acknowledged that trade secret laws serve sufficiently important societal purposes to justify limiting First Amendment rights in proper cases, citing:
* Cohen v. Cowles Media Co., 501 U.S. 663 (1991)
* Zacchini v. Scripps-Howard Broadcasting Co., 433 U.S. 562 (1977)
* Kewanee Oil Co. v. Bicron Corp., 416 U.S. 470 (1974)
* Ruckelshaus v. Monsanto, 467 U.S. 986 (1984).
He called for a more rigorous evidentiary standard in order to "separate meritorious trade secret claims from those involving protected speech." A plaintiff, contended Justice Moreno, "should be required to actually establish a likelihood of prevailing on the merits, regardless of the balance of harms," or the injunction may constitute a prior restraint.
He noted that a higher hurdle would also apply when substantial public concern was implicated, even when secrets were unlawfully obtained, as in New York Times v. United States, 403 U.S. 713 (1971).
He contended that there was "no likelihood that DVD CCA would prevail on the merits," because "DeCSS was not demonstrably secret in this case when Bunner republished it, and Bunner was neither alleged to be the original misappropriator nor to be in privity with any such misappropriator."
He also cast doubt on the use of a license agreement to nullify statutory language excepting reverse engineering from the definition of "improper means" of acquiring a trade secret, citing Bonito Boats, Inc. v. Thunder Craft Boats, Inc., 489 U.S. 141 (1989) and the exclusive jurisdiction of federal patent law.