Phil Agre
«The Internet and Public Discourse»
Many legal systems, for example in the United States, have had difficulty comprehending the Internet because incompatible precedents based on so many existing media (post, telephone, newspaper, street corner, etc.) seem to apply. The Internet frustrates these traditional analogies because it is really a meta-medium: a set of layered services that make it easy to construct new media with almost any properties one likes. Despite this great flexibility, however, the dynamics of technical standards are emerging as a potentially conservative force. To help in mapping afresh the legal and political concerns that the Internet has raised, this article sketches a series of four models of the interaction between Internet architecture and public discourse.
Contents
The Internet as a communications medium
The Internet as a computer system
The Internet as discourse
The Internet as a set of standards
For all of its precision in the technical realm, as a social phenomenon the Internet still seems inchoate [1]. Analyses of law and policy have found it remarkably hard to fit familiar models to the Internet. I want to explore why this is, and to take a shot at remedying the situation. When we talk about "the Internet", of course, we could mean a lot of different things. We could be talking about the TCP/IP protocols and the computers that use them; on this view the Internet is a big electric circuit that happens to cover the earth, or at least the relatively affluent parts of it. But for the most part, when we talk about the Internet in the context of important public issues, we mean to refer to something larger, and to give shape to the intuition that the Internet is increasingly bound up with the conditions and practices of public discourse.
I propose, therefore, to sketch a series of models of the Internet -- a series of analyses of the relationship between the Internet and public discourse. In doing so, I hope to provide rational reconstructions of several of the most widely publicized and broadly contested controversies that surround the Internet, and perhaps supply a vocabulary for addressing those controversies more systematically. I will describe four models.
The Internet as a communications medium
The first of these models suggests, very simply, that the Internet is a communications medium. I take this to be the dominant model, and the dominant terms in which a whole host of controversies surrounding the Internet have been debated. When we consider the Internet in this way in the context of specific disputes, however, the question immediately arises of which medium the Internet is. Is it the telephone? Newspaper? Television? Lecture hall? Street corner? The problem is that the Internet, in its many real and envisioned applications, seems to afford all of these analyses, either separately or in monstrous combination.
Arguments over constitutional matters such as the Communications Decency Act,
or business matters such as the so-called push technologies, or policy matters such as Internet telephony, are effectively debates over which precedent shall apply. Given that every party to these debates typically finds one of the precedents more congenial in its consequences than the others, we are often too busy fighting to take in the awesome extent to which the answers to our questions are indeterminate. The Internet, considered in this way, is very nearly whatever you want it to be.
This would not seem like a good situation. From the point of view of technical people, however, the situation is not at all paradoxical. That you can make the Internet into whatever you want is, for them, precisely the point. The Internet is a kind of meta-medium; the strategy of TCP/IP is to interpose a new service layer between transport and applications, so that developers can choose their metaphors with little concern for how the stuff gets moved around. Digitalization is of course the first key to this strategy, but there is more to it. Nor, I might mention in passing, is the Internet the only example of the strategy. The so-called software radio will shortly permit designers to decouple the formats and protocols of wireless data exchange from the technically horrid details of their analog hardware implementation [2].
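To make the layering concrete, here is a minimal sketch in Python (the port, the JSON framing, and all names are invented for illustration; this is not any standard protocol). The transport layer simply moves opaque bytes between sockets; whether those bytes constitute a letter, a broadcast, or a lecture is decided entirely by the ontology the application imposes on them.

import json
import socket
import threading

HOST, PORT = "127.0.0.1", 9999  # hypothetical local endpoint for this demo


def serve_once(srv: socket.socket) -> None:
    """Accept one connection and print whatever 'medium' the sender invented."""
    conn, _ = srv.accept()
    with conn:
        payload = json.loads(conn.recv(4096).decode("utf-8"))
        # TCP never sees this distinction; only the application-level ontology
        # says whether these bytes are a letter, a broadcast, or a lecture.
        print(f"received a {payload['medium']!r}: {payload['body']}")


def send(medium: str, body: str) -> None:
    """Send the same bytes over TCP, whichever metaphor the sender chose."""
    with socket.create_connection((HOST, PORT)) as conn:
        conn.sendall(json.dumps({"medium": medium, "body": body}).encode("utf-8"))


if __name__ == "__main__":
    with socket.create_server((HOST, PORT)) as srv:  # bind and listen before sending
        worker = threading.Thread(target=serve_once, args=(srv,))
        worker.start()
        send("letter", "Dear reader: the transport below neither knows nor cares.")
        worker.join()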
The Internet as a computer system
Let us, therefore, try again. The second model treats the Internet as a computer system, a product of a more general set of practices of system design. Andrew Feenberg [3], among others, has observed that computers have a dual character -- specifically, that computers are representational machines that represent the world in at least two different ways. One of these is as a medium; the test is roughly whether the machine analyzes the representational stuff at the level for which it is meaningful to people. WordPerfect doesn't know the genre of your document, and PhotoShop doesn't know what is going on in your images.
At another, more basic level, however, a computer operates on the basis of a systematic analysis of the world to which its computations are supposed to refer. The first step in designing a computer system is the construction of a data model, or what philosophers call an ontology - an enumeration of the types of things that the designer supposes the world to contain [4]. These categories, so-called entities, might include people, cars, bank accounts, products, documents, or computers. They might also include entities within the machine or network, such as printer jobs. The idea is that the operation of the machine presupposes, and depends upon, the maintenance of an accurate one-to-one correspondence between the data records in the machine and the real things in the world that the data records are supposed to represent.
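As a hedged illustration, a data model of this kind might look like the following toy sketch in Python; the entities and fields are invented for the example and merely stand in for whatever categories a real designer would choose.

from dataclasses import dataclass


@dataclass
class Product:
    sku: str            # presumed to correspond one-to-one with a real product
    name: str
    price_cents: int


@dataclass
class Document:
    document_id: str    # presumed to correspond one-to-one with a real document
    title: str
    author: str


# The machine can only compute with what these records capture, and its results
# are meaningful only insofar as the records are kept in step with the actual
# products and documents they are supposed to represent.
catalogue = {
    "sku-001": Product(sku="sku-001", name="widget", price_cents=499),
}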
This fact, seemingly simple enough, has vast consequences; it directs our attention to the tremendous variety of material arrangements by which the internal workings of machines are tied to the rest of the world. A machine can only compute with what it can capture, and so the world must be instrumented accordingly, whether through paperwork or tracking devices or ID cards or heaven knows what [5].
Even beyond this, consider the consequences of a simple computational operation such as the addition of two numbers. If a machine contains one number that originated in New Jersey and another number that originated in Idaho, the sum of those two numbers is only meaningful if the numbers are commensurable, that is, if the same sorts of things exist to be measured in both places, and if both measurements were conducted in the same way. If computers are to perform a great diversity of meaningful computations, as they do every day, then the world must be standardized in a great diversity of ways [6].
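A deliberately artificial example, with invented readings and conversion factors, makes the point concrete: before the two numbers can be meaningfully added, both measurements must be expressed against the same standard.

def total_rainfall_mm(reading_nj: float, unit_nj: str,
                      reading_id: float, unit_id: str) -> float:
    """Convert both readings to a common standard (millimetres) before adding."""
    to_mm = {"mm": 1.0, "inch": 25.4}
    return reading_nj * to_mm[unit_nj] + reading_id * to_mm[unit_id]


# Without the shared standard, 2.0 + 50.0 would be a number but not a measurement.
print(total_rainfall_mm(2.0, "inch", 50.0, "mm"))  # 100.8 mm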
My main concern here is not with numbers, however, but with the Internet and its place in public discourse. And for that purpose the ontologies that matter most are precisely ontologies of discourse, that is, the elements that computer system developers have imagined discourse to comprise. So far as narrow matters of technical practice are concerned, designers enjoy a vast freedom to choose whatever categories they like. This is the sense in which the Internet is a meta-medium: Internet-based applications can be designed using ontologies derived from many spheres of life, including the various media industries and other conventionalized forms of communication. Of course, the Internet functions as a newspaper, or as a telephone, or as a lecture hall, et cetera, to the extent that the software is coupled to an institutional field - the one within which the ontology of the newspaper or phone system or university already functions. The couplings between most Internet applications and their institutional surroundings have thus far been relatively weak, and this has contributed to a sense of the Internet as a wholly separate sphere. This situation, however, is changing rapidly as the Internet is integrated into the workings of institutions.
We should be concerned with this coupling in many ways. The designer's creative freedom, for example, sounds like a kind of power, the power involved in defining one's ontology in one way rather than another, and the consequences of implementing that ontology on a new kind of hardware that comes with manifold institutional couplings of its own. For one thing, every datum that is captured in a digital medium can in principle be stored indefinitely and reused easily for any purpose. Communication that might otherwise have been bounded by four walls, or the expense of photocopying, or the vagaries of human memory, now exists in Platonic perfection as a digital record that can potentially be submitted to a wide variety of other purposes. As a result, the regulation of those purposes arises as a systematic problem that had formerly been kept within relatively manageable bounds by the enabling and constraining limits of the physical world, or of previous, less generalized media.
The Internet as discourse
To reckon with at least certain aspects of the seemingly wide-open design of new digital media, it will help to sketch out a third model of the Internet. Our starting point is once again the main tradition of computer system design practices, but now under a different aspect. From this perspective, what system developers do is to transform social discourse into machinery. Paradoxical as this description may sound, it is only a mild inflection of system developers' own understanding of systems analysis: one starts with a corpus of discourse, namely someone's explanation of what the system is supposed to do, and one performs grammatical analyses on this discourse. The nouns - car, person, bank account - become entities in the aforementioned data model, the verbs - register, hire, open - become the names of procedures and methods, and so on. The question that is not a part of system developers' sphere of professional concern is where the discourse comes from. In even referring to it as a discourse, I intend to point to its social origins: the institutional processes, with all of their strengths and limitations, through which the discourse arises.
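A small, hypothetical sketch of this grammatical move, with a made-up requirements sentence and invented names, might look as follows; what stays outside the frame, as noted above, is where the requirements discourse itself came from.

# Requirements discourse: "The firm hires a person; the person opens a bank
# account and registers a car."

class Person:                                   # noun -> entity
    def __init__(self, name: str) -> None:
        self.name = name


class BankAccount:                              # noun -> entity
    def __init__(self, owner: Person) -> None:
        self.owner = owner
        self.balance = 0


class Car:                                      # noun -> entity
    def __init__(self, plate: str, owner: Person) -> None:
        self.plate = plate
        self.owner = owner


class Firm:                                     # noun -> entity
    def __init__(self) -> None:
        self.staff: list[Person] = []

    def hire(self, person: Person) -> None:     # verb -> method
        self.staff.append(person)


def open_account(owner: Person) -> BankAccount:       # verb -> procedure
    return BankAccount(owner)


def register_car(plate: str, owner: Person) -> Car:   # verb -> procedure
    return Car(plate, owner)


firm, alice = Firm(), Person("Alice")
firm.hire(alice)
account = open_account(alice)
car = register_car("NJ-1234", alice)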
The Internet makes a fine example. The Internet's predecessor, the ARPANET, was the implementation of a particular discourse - the Advanced Research Projects Agency's discourse about the American scientific community and its infrastructural needs. We can see this whole discourse as a discourse by looking at the spectacular career of the Internet in subsequent years. The original ARPANET discourse, like any discourse, made a series of unarticulated or partly articulated assumptions, and these assumptions were, so to speak, built into the protocols. One assumption was that the user community had a strong capacity for collective self-regulation, so that the network need not be terribly secure. As the Internet's use has spread beyond the scientific community, all manner of holes have become visible in the Internet protocols. Peer pressure in the scientific community is sufficiently effective that one would not even think of scientists sending spam, at least not routinely or on a massive scale. As a result, a variety of weaknesses in the Internet's electronic mail protocols have only become evident as spammers have begun to exploit them in the last couple of years.
This example, and others like it, point to a process of social discovery that is part and parcel of all technology adoption, and particularly the adoption of distributed computer technologies. It is a hermeneutic process: as the technology is used in new ways, we gain a deeper understanding of the ideas that motivated it. Those ideas, and the discourses that convey them, have their own historicity, their own metaphors, their own depths of unarticulated assumptions, and as we hit ourselves on the head in the adoption and adaptation of new technologies, we create the conditions for bringing those depths somewhat more fully into consciousness.
Moreover, because of the aforementioned ability of digital media to repeal the frequently useful limitations of the physical world, as disputes arise in the new context we are frequently forced to conceptualize more deeply the moral bases of our rules [7]. So long as walls functioned as walls, we could make laws about privacy and property by making laws about walls. As electronic media increasingly breach physical walls, we are compelled to articulate more fully the moral basis for privacy and property without so much reference to the architectural basis. And inasmuch as the walls of digital environments are simply discursive constructs like any others, walls are increasingly located precisely where the law says they are, and not just where custom and engineering practicality have placed them. This shift can be overemphasized (law has always had opinions about where walls should go, walls had already been breached by other technologies before the Internet came along, and so on), but its direction can hardly be denied.
The Internet as a set of standards
It is evident, therefore, that the discourse-made-machinery that constitutes the Internet has a political significance that is almost frighteningly profound. Computer systems are the products of discourse, among other things, and they are, among other things, important media for such discourse. To comprehend this reciprocal relationship between the Internet and social discourse, it will be helpful to articulate a fourth and final model of the Internet. Our focus here is on standards. Ted Nelson [8] accurately asserts that the software industry is about the politics of standardization. And as we have seen, both here and in Larry Lessig's [9] analysis of content filtering software, it also works the other way: software design, at least some of the time, sets the standards of politics. Put another way, the antitrust concern with the control of standards is a dialectical complement of the free expression concern with the standards of control. We care about standards because of the fantastically complicated economic question of who captures the often considerable value that is created through the establishment of a standard [10].
And we also care about standards because, as we have seen, they arise through the condensation of processes of social discourse. Social discourses are not neutral or innocent; to the contrary, to at least the extent that our discourses about discourse take substantive positions about the nature of society and social relationships, the standards of emerging media of social discourse tend to embody these positions as well. This is a rough and simple statement of something that requires considerably more analysis, but I think it at least accurately captures one of our concerns.
Underlying each of these concerns are the economic dynamics of standards, and particularly the technical compatibility standards where the issues most sharply arise. The work of Paul David and many others suggests that standards are path-dependent, and that because of network effects they tend to have a winner-take-all quality, with one standard becoming dominant and devotees of other standards becoming stranded [11]. Neoclassical economists have mounted a sophisticated counterattack on these models of market failure [12], and the matter is anything but settled. My purpose here, however, is not so much to settle it as to delineate the specifically political reasons why we care about it. Roughly put, to the extent that Internet standards shape public discourse, their rule-setting function is a matter of political concern. And to the extent that the Internet serves as a medium for the agenda-setting from which a wide variety of technical standards emerge, the properties of that medium and the larger technical public sphere of which it is a part are likewise matters of political concern [13].
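A toy simulation, in the spirit of the increasing-returns models cited here (the reinforcement rule and parameters are my own illustrative choices, not David's or Arthur's specification), shows how small early leads can lock a single standard into dominance.

import random


def adoption_run(n_adopters: int = 10_000, gamma: float = 2.0, seed: int = 0) -> dict[str, int]:
    """Each adopter picks a standard with probability proportional to
    (current adopters) ** gamma; gamma > 1 models increasing returns."""
    random.seed(seed)
    counts = {"standard_a": 1, "standard_b": 1}  # one seed adopter each
    for _ in range(n_adopters):
        weights = [c ** gamma for c in counts.values()]
        choice = random.choices(list(counts), weights=weights)[0]
        counts[choice] += 1
    return counts


# With gamma > 1 one standard almost always ends up with nearly all adopters;
# which one wins depends on the early, path-dependent draws (try other seeds).
print(adoption_run())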
This political perspective illuminates both the economic and the technical dynamics of standardization. One way that standards create social value is through what we might call economies of generality. To the extent that activities in a series of sites can be fitted to a common framework, many types of information and knowledge work achieve greater economies of scale. In enterprise computing, for example, the trend is away from custom-built systems that reflect the ontologies and discourses of particular organizations to the adoption of standardized software modules, bought off the shelf and configured for each organization's needs. The price here is the work of conforming the organization to the software package; one benefit among many is that the cost of developing the package can be distributed across many organizational users. As Nathan Myhrvold puts it, with personal computer software you can get $100 million worth of software for $100, and so it goes with increasingly many de facto standard software packages as well [14].
The concern here is not precisely the imposition of bland homogeneity and uniformity upon the whole world; the establishment of standards on one layer frequently creates the conditions for an explosion of creativity on the layer above. The concern, rather, is that this burst of creativity, too, becomes subject to the same path-dependent, winner-take-all kind of standardization that made it temporarily possible. If it does not seem substantively crucial which personal computer operating system takes over the world, or which internetworking protocol, consider the emerging tornado of activity, one or two layers up, to build new infrastructure for digital universities [15]. Such projects may bring new efficiencies, but also the danger of a greater degree of ontological standardization, not to mention a potentially greater capacity for the regulation of content, as lectures and class discussions are captured digitally for the ideologically motivated to peruse [16].
None of this is inevitable. My analysis describes concerns and dangers, forces and patterns, not essences and predictions. Nonetheless, if this model-building exercise accomplishes anything, may it provide an emphatic counterpoint to the romantic millennialism that portrays the Internet as the end of politics and the guarantor of decentralization. It is neither. To the contrary, the economics and the politics of the Internet are as one, and the institutional transformations that the Internet is already facilitating are political processes in the deepest possible sense - a near-total renegotiation of the mechanisms and mediations of our lives together here on earth.
About the Author
Philip E. Agre is an associate professor of communication at the University of California, San Diego. He received his PhD in computer science from MIT in 1989 and taught at the University of Chicago and the University of Sussex before arriving at UCSD in 1991. He is the author of Computation and Human Experience (Cambridge: Cambridge University Press, 1997) and the coeditor of Computational Theories of Interaction and Agency (with Stanley J. Rosenschein; Cambridge: MIT Press, 1996), Technology and Privacy: The New Landscape (with Marc Rotenberg; Cambridge: MIT Press, 1997), and Reinventing Technology, Rediscovering Community: Critical Studies in Computing as a Social Practice (with Douglas Schuler; Ablex, 1997). He also edits the Red Rock Eater News Service, an Internet mailing list that distributes useful information on the social and political aspects of networking and computing to 4000 people in 60 countries. His home page is located at http://communication.ucsd.edu/pagre
E-mail: pagre@ucsd.edu
Notes
1. This article is a revised version of a paper presented at the 1998 Annual Meeting of the American Association of Law Schools, San Francisco. I wish to thank Jon Weinberg for organizing the panel on "Fitting Models to the Internet" of which the paper was a part. I also appreciate the comments of the anonymous referees and the bibliographic assistance of Paul Jonusaitis and Dave McArthur.
2. Joe Mitola, 1995. "The software radio architecture", IEEE Communications
Magazine, Volume 33, number 5, pp. 26-38.
3. Andrew Feenberg, 1991. Critical Theory of Technology. New York: Oxford
University Press.
4. On data models see Graeme C. Simsion, 1994. Data Modeling Essentials:
Analysis, Design, and Innovation. New York: Van Nostrand Reinhold. On their
significance for the present argument see Philip E. Agre, 1997. "Beyond the mirror world: Privacy and the representational practices of computing", In: Philip E. Agre and Marc Rotenberg (eds.), Technology and Privacy: The New Landscape,
Cambridge: MIT Press.
For a more sophisticated approach to the ontology of computation, see Brian Smith,
1996. On the Origin of Objects. Cambridge: MIT Press.
5. Philip E. Agre, 1994. "Surveillance and capture: Two models of privacy",
Information Society, Volume 10, number 2, pp. 101-127.
6. Geoffrey Bowker, 1994. "Information mythology: The world of/as information",
In: Lisa Bud-Frierman (ed.), Information Acumen: The Understanding and Use of Knowledge in Modern Business, London: Routledge.
7. For one example of this pattern, see the analysis of the obsolescence of the common-law concept of negotiability in Raymond T. Nimmer and Patricia Krauthouse, 1995. "Electronic commerce: New paradigms in information law",
Idaho Law Review, Volume 31, pp. 937-966.
8. Personal communication.
9. Larry Lessig, "What things regulate speech", available through Cyberspace Law
Abstracts, at http://www.ssrn.com/update/lsn/cyberspace/csl_papers.html
10. Michael L. Katz and Carl Shapiro, 1994. "Systems competition and network effects", Journal of Economic Perspectives, Volume 8, number 2, pp. 93-115.
11. Paul A. David, 1985. "Clio and the economics of QWERTY", American
Economic Review, Volume 75, number 2, pp. 332-337. See also W. Brian Arthur, 1989. "Competing technologies, increasing returns, and lock-in by historical events", Economic Journal, Volume 99, pp. 116-131.
On Internet standards in particular, see Mark A. Lemley, 1996. "Antitrust and the Internet standardization problem", Connecticut Law Review, Volume 28, pp. 1041-1094.
12. S. J. Liebowitz and Stephen E. Margolis, 1994. "Network externality: An uncommon tragedy", Journal of Economic Perspectives, Volume 8, number 2, pp. 133-150.
13. Timothy Schoechle, 1995. "The emerging role of standards bodies in the formation of public policy", IEEE Standards Bearer, Volume 9, number 2, pp. 1, 10. See also Richard W. Hawkins, 1995. "Standards-making as technological diplomacy: Assessing objectives and methodologies in standards institutions", In:
Richard Hawkins, Robin Mansell, and Jim Skea (eds.), Standards, Innovation and
Competitiveness: The Politics and Economics of Standards in Natural and Technical Environments. Hants (England): Edward Elgar.
On the rule-setting function of Internet standards, see Joel Reidenberg,
forthcoming. "Lex informatica: The formulation of information policy rules through
technology", Texas Law Review, Volume 76.
14. Myhrvold's comments appeared in an interview in Wired Volume 3, number 9,
pp. 152-155, 198, and at http://www.wired.com/wired/3.09/features/myhrvold.html. It should be noted that many products exhibit substantial economies of scale, for example because of large fixed costs of production. Information commodities such as software are distinctive in that their costs of production are almost all fixed. For the vast consequences of this fact in the case of media content, see C. Edwin Baker, 1997. "Giving the audience what it wants", Ohio State Law Journal, Volume 58, number 2, pp. 311-417. Just as network effects are sometimes analyzed as demand-side economies of scale because they consist of the mutually reinforcing benefits enjoyed by the putatively homogenous consumers of a product or service,
economies of generality might be understood as demand-side economies of scope:
they depend on the possibility of applying the same software package to a largenumber of organizations despite their inevitable heterogeneity.
On the role of economies of scope in the development of the computer industry, see Alfred D. Chandler, Jr., 1997. "The computer industry: The first half-century", In: David B. Yoffie (ed.), Competing in the Age of Digital Convergence. Boston: Harvard Business School Press, pp. 99-100; and Kenneth Flamm, 1988. Creating the Computer: Government, Industry, and High Technology. Washington:
Brookings Institution, pp. 210-214.
15. See, for example, the ambitious and detailed ontology of discourse that is being developed in the educational domain by the IEEE P1484.9 Task Ontology Working Group, http://www.manta.ieee.org/p1484. More generally, see Educom's Instructional Management Systems Project, at http://www.imsproject.org/
16. Thomas Sowell, 1994. "Letting in the light", Forbes (12 September), p. 98.
Copyright © 1998, First Monday.