An Introduction to the IP Infrastructure
Writing can serve to educate the author as much as communicate with the reader. I initially envisioned this essay as providing some technical background for those who need to understand the new infrastructure. The technology itself is simple. Its significance is in the marketplace and the consequent economic and societal effects.
Technologies themselves may be neutral but their characteristics have consequences. The IP infrastructure creates a free marketplace for innovation. The question then is not whether it is good or bad, but how to adapt to a world in which it is a given. Why is this so? What does it mean?
The goal of the Internet was initially to interconnect local networks around the world. The term itself applies to the collection of networks that share a common Internet Protocol, or IP. The need to span disparate networks has the benefit of forcing designers to shed the design assumptions implicit in networks built for local needs.
The result has been a communications infrastructure reduced to its most basic role of providing connectivity. At the same time, the rapidly developing computer technologies were changing the basic rules of design. The telephone network was a remarkable feat of engineering, one possible only by carefully tuning the network to the specific needs of telephony. With the newer digital technologies, the advantage shifted to flexible design and the ability to take advantage of unanticipated opportunities. Packet radio networks, the Ethernet and the Arpanet provided test beds for some of the newer ideas.
The stage was set for the IP infrastructure. What started out as a solution to an engineering problem resulted in a fundamental change in the rules of communication. By avoiding the biases inherent in previous networks, the IP infrastructure has created a marketplace for new concepts that can take advantage of abundant connectivity.
The Importance of the IP Infrastructure
The IP infrastructure represents a key boundary between a communications medium and the applications that are built upon this medium. It is a very special boundary since it allows two very different marketplaces to operate on their own terms.
Traditionally, applications that required connectivity, such as the telephone and television systems, required an infrastructure built for that purpose. In order to provide telephone service, you first had to make major investments in physical plant and in special hardware at the end points (telephones). In order to create the television industry, we had to standardize every aspect of how cameras and televisions worked and how they exchanged signals. Both were amazing engineering feats in their times, but the interdependencies between the elements of the system have made change exceedingly difficult.
A side-effect of such vertically integrated systems was that the providers were able to effect tight control over the infrastructures. AT&T, especially in its heyday, could pride itself on only allowing you to buy the highest quality phones that would last for many years. Even now, a phone company can charge a high fee for a service such as caller ID, which costs them absolutely nothing. In fact, they needed to add capabilities to prevent you from getting the caller ID information.
We can contrast this with the very competitive market in commoditized computation, especially in the arena of personal computing. We often use the term "Moore's law" to refer to the rapid decrease in the price of computation. Improvements in price/performance of a trillion to one are not an exaggeration.
But during this period the communications industry has stagnated because vertical integration, and the consequent dependencies between the layers, have allowed the providers to charge not for the communications itself but for the services they build. There is no incentive to follow Moore's law either in pricing or, more important, in creating underlying technologies.
IP eliminates the interdependencies by separating expensive communications infrastructure from the applications that are built on it. Rather than buying a word processor, we are used to buying a computer and adding word processing as a software application. We can shop for the best computation and then buy the best word processing software. Microsoft's success is due to focusing on the software and letting others innovate on the hardware.
We can think of the IP infrastructure as a currency that replaces a barter system. Not only is it more convenient, but it allows the creation of a worldwide marketplace and the creation of new financial instruments. The IP infrastructure allows innovation in new applications that can simply assume ready availability of very inexpensive connectivity.
Current communications networks do more than transport data, they take responsibility for the meaning of the traffic. For example, the telephone network "knows" that it is transporting a phone conversation. At any point, you know how to interpret the signal and listen to it. We've had to disguise data traffic as a conversation in order to pass it through this network.
In the IP infrastructure, there are just bits. There are no telephony packets or video packets. There are just packets. Telephony, like word processing, is just an application that runs outside the infrastructure and uses IP packets for connectivity. A term like "Internet Telephony" sounds precise, but there is no one standard, just a set of conventions, some more common than others. Traditional telephony depended upon precise conformance to standards. The IP infrastructure can be shared without imposing such restrictions.
Since there are no semantics associated with the information within the network, one cannot regulate it and one cannot censor it. Of course, one can try, since it is possible to guess at the semantics. This is typically done by placing gatekeepers at the edges: for example, watching traffic on dial-up connections to ensure that no one can access Web pages deemed inappropriate. Doing so represents a fundamental denial of the nature of the connectivity marketplace. To a gatekeeper, even modest use of the Internet, such as browsing only approved sites, is very valuable, and it is not obvious that there is far more value in allowing innovation than in simply making existing applications more efficient. This is a classic problem that Clayton Christensen points out in The Innovator's Dilemma.
The difficulty of imposing controls and censorship has public policy and economic implications. Amartya Sen claims that where there has been a free press, there has been no famine. The IP infrastructure (or, loosely, the Internet) has the promise of being a free press for the world because of the difficulty of intervening without changing the very nature of the infrastructure.
The economic value of the IP infrastructure is in creating a marketplace for innovation and discovery. If we can attribute much of the "new economy" to this vibrant marketplace, then the price for opting out of the infrastructure is to elect stasis. This is an observation born of experience rather than of an a priori bias towards free markets.
What is the IP Infrastructure?
The Power of the Packet
The basic technology of the IP infrastructure is exceedingly simple. From a purely aesthetic point of view, the simplicity is an indicator of a correct solution to a problem. This simplicity makes it easy to extend the network without complex relationships. It also makes it easy for disparate developers to build applications upon the infrastructure without needing to understand the details.
The basic unit of the IP infrastructure is the "packet". A packet is like an envelope for mail. It has a destination address and a payload of data. Just as we can drop an envelope in a mailbox, we can drop these bits onto our network. Unlike the Post Office, the packet can be delivered in less than a second anywhere else in the world, whether the destination is ten thousand miles away or a few millimeters away (digital ants?).
And, just like the Post Office, there is no guarantee that the packet will be delivered. The Post Office does provide services like certified delivery. For the Internet (to use the broader term), one can "purchase" reliable delivery as an application service.
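The envelope analogy maps directly onto code. The sketch below (address and port are placeholders of mine) sends a UDP datagram, the everyday programming interface closest to a bare IP packet: an address plus a payload, with no delivery guarantee from the network itself.

```python
import socket

# A UDP datagram is the everyday programming interface closest to a bare
# IP packet: a destination address plus a payload, with no delivery
# guarantee from the network itself.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

destination = ("127.0.0.1", 9999)   # placeholder address and port
payload = b"hello, world"

# "Drop the envelope in the mailbox": sendto returns as soon as the
# packet is handed to the network, whether or not anyone receives it.
sent = sock.sendto(payload, destination)
sock.close()
```

Reliable delivery is exactly the kind of service the endpoints add on top of this: TCP, for example, is built from such unreliable packets plus acknowledgments and retransmission.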
For traditional network designers, the chaos of the Internet was a shock. The surprise is not only that such a chaotic network can work, but that the chaos is a necessary part of creating a very large-scale network. Understanding this has major implications not only at a technical level, but also at a philosophical level.
The seeming reliability of the older networks turns out to be naive. What is important is not whether the message got to the front door, but whether it reached its destination and was accepted. This is called the end-to-end argument. For example, when using the network for telephony, delaying packets in order to assure that all are delivered interrupts the conversation. Better to give up on lost packets and guess what they contained than to impose a delay. It isn't always obvious what the final destination is. If we send a message to a light to turn on, is the operation complete when the switch is thrown, when the light itself glows, or when (and if) the room is sufficiently bright? There is no one right answer, and a network that imposes one will only frustrate innovation.
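The telephony example can be made concrete. In this sketch (the deadline and the data are illustrative assumptions of mine, not from the essay), the receiving end makes the end-to-end choice itself: a packet that arrives too late for its slot in the conversation is abandoned and its contents guessed, rather than stalling the call to wait for it.

```python
# A sketch of the end-to-end choice a voice application makes.
# The deadline and sample data below are illustrative assumptions.

PLAYOUT_DEADLINE_MS = 100  # assumed budget for conversational delay

def handle_voice_packet(seq, age_ms, last_sample, packets):
    """Pick the sample to play for this slot in the audio stream."""
    if age_ms > PLAYOUT_DEADLINE_MS or seq not in packets:
        # Too late, or lost: guess by repeating the last sample rather
        # than stalling the whole conversation to wait for it.
        return last_sample
    return packets[seq]

print(handle_voice_packet(1, 40, "prev", {1: "cur"}))   # on time: plays "cur"
print(handle_voice_packet(2, 150, "prev", {2: "cur"}))  # too late: repeats "prev"
```

A file transfer would make the opposite choice with the same unreliable packets: every byte matters, so the application waits and retransmits instead of guessing. The network imposes neither policy.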
Network reliability has a more pernicious effect: lulling applications into a false sense of security. Network reliability is a minor issue compared to other failures. The Web works because the browsers do not trust the network and verify information from other systems. It makes no difference whether the other party is malicious or simply incompetent or, for that matter, whether our local system is confused. This approach of cooperating but verifying has allowed the Web to grow to millions of systems since failures do not propagate.
This is in sharp contrast to traditional computer systems that assume that the services they are built upon are reliable. In fact, many of the problems we have with computer systems are due to the naive assumption that we can conquer complexity by simply building layers to hide it.
By embracing chaos, we accelerate the commoditization of the communications infrastructure. The emphasis in the physical networks is on being a very good commodity: inexpensive and plentiful. Overengineering for a presumed usage pattern is penalized.
More important, learning how to deal with a chaotic environment is a step towards learning to deal with reality. We are at the early stage of a transition from computers and computation as something apart, something we see through screens, to being an integral part of just about everything. Devices will not be gratuitously dumb and isolated. More subtle will be the effect of this shift upon our understanding of ourselves and, well, reality.
As an aside, the IP infrastructure also upsets the political order by making it very difficult to frustrate the exchange of information. It is this threat that leads many to fear its effects and to try to control it.
Under "The Hood"
While our primary focus is on the implications of this infrastructure, it is worth appreciating the effort that goes into making it simple to create new services. Unlike the phone network, there are no circuit paths; each packet must find its own way to its destination. The techniques for doing this are evolving rapidly. Since the capacity of the "pipes" (such as fiber optic cables) is increasing rapidly, it is possible to haul all the traffic to common switching points rather than traversing many intermediate points. But handling so much traffic requires high capacity, as the flow through these points exceeds trillions of bits per second.
The traffic itself is very chaotic in the sense that one can't predict "data storms" which can overwhelm the network for a period of time. This makes planning difficult and is all the more reason that the end points need to be prepared to handle delays or packet losses from time to time. The issue of Quality of Service (or QOS) is a major concern in systems architecture. The best solution and, perhaps the only one, is to provide much more capacity than is needed at any point in time. Given the rapid growth in traffic, assuring sufficient capacity continues to be a challenge.
The network itself needs to overcome some legacy issues and bad policy based on a lack of understanding of the nature of the network. One problem is simple: a failure of imagination in the 1970s left us with an address that is too small to handle a truly universal network. It is the Internet's version of Y2K. Unfortunately, those currently on the network have tuned their usage to this limit and have difficulty making expansion a priority. The result is that new applications are frustrated by having to work around these limits and thus lose some of the key benefits of the simplicity of the IP layer. The hope is that the pressure of new applications will once again force the issue, just as it has in the past.
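The limit in question is the 32-bit address of the original IP design. A quick calculation (the numbers are mine, not the essay's) shows why it falls short of a truly universal network; the successor design, IPv6, widens the address to 128 bits.

```python
# The original IP design uses a 32-bit address; arithmetic alone shows
# why it falls short of a truly universal network.
total_addresses = 2 ** 32
print(total_addresses)            # 4294967296: about 4.3 billion

# Billions of people, each with many networked devices, exhaust that
# space long before accounting for addresses lost to how blocks are
# allocated. The successor design, IPv6, uses 128 bits instead.
ipv6_addresses = 2 ** 128
print(ipv6_addresses > 10 ** 38)  # True: astronomically larger
```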
Bad public policy based on the naive notion that one can differentiate between the different kinds of bits on the network is also a threat because it requires that the network know about the applications. This goes beyond merely biasing the design towards well-understood uses; creative uses become threats because they are viewed as attempts to avoid these controls. Much of what is called Internet policy is really rethinking social policy in the context of the new infrastructure. Unfortunately, the rules are often aimed at the infrastructure itself rather than the applications. It is like regulating the water supply in order to limit what drinks people can mix.
The Internet does, in fact, force us to reexamine many social policies. The problem is in treating these as Internet issues rather than social policy issues.
Effects of the IP Infrastructure
The personal computer has enabled us to create word processors and spreadsheet machines by just adding software to readily available hardware. Not only was this a low overhead process, and thus low risk, it was also far more rapid than a process that would have required new hardware for each new application.
The power of rapid iteration is illustrated by the Simple Mail Transport Protocol, or SMTP. This is a very simple protocol, one that could be implemented in the requisite weekend and then tested with just a standard computer and a keyboard. By contrast, all the telecommunications providers in the world and their governments rallied around the official standard, X.400. The problem with X.400 was that it took five years to get enough experience to uncover its flaws and then another five years for the next version. In hindsight, it is hard to see how X.400 could have succeeded against a protocol that not only evolved rapidly in the face of experience but also benefited from the advancements in technology, namely the availability of universal connectivity in the guise of the Internet. Unlike X.400, SMTP could focus on mail exchange, while X.400 had the burden of completely specifying every aspect of email.
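To show just how little there is to implement, here is a sketch (hostnames and addresses are hypothetical) of the command sequence a bare-bones SMTP client sends. Server replies and error handling are omitted; the point is that the whole vocabulary fits on a screen and could be typed by hand at a keyboard.

```python
# A sketch of the client side of a minimal SMTP exchange. Hostnames and
# addresses are hypothetical; server replies and error handling omitted.

def smtp_session(sender, recipient, body):
    """Return the commands a bare-bones SMTP client would send."""
    return [
        "HELO example.org",          # identify the sending host
        f"MAIL FROM:<{sender}>",     # envelope sender
        f"RCPT TO:<{recipient}>",    # envelope recipient
        "DATA",                      # the message text follows
        body + "\r\n.",              # a line with a lone dot ends the message
        "QUIT",                      # close the session
    ]

commands = smtp_session("alice@example.org", "bob@example.net", "Hi Bob")
for line in commands:
    print(line)
```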
The Web is similar. A very simple way of marking up text so that it can look good with some embedded pictures, and a very loose way of referring to other systems on the network, allowed for rapid deployment and innovation. The familiar URL (Uniform Resource Locator), http://www.whatever.domain, is just a text description. Its target may no longer be available, but the only consequence is that the browser says it can't find the page. Perhaps it will be available later, perhaps not. Since this is a common occurrence, it is annoying but not fatal. Because HTML itself was simple, there was no need to wait for tools and, when the tools became available, users were able to avoid the limitations and assumptions built into the tools by directly editing the HTML itself. Again, the basis for rapid innovation.
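The point that a URL is just a text description can be shown directly. In the sketch below (the host is the essay's placeholder; the path is added for illustration), taking a URL apart requires no network access at all and says nothing about whether the target actually exists.

```python
from urllib.parse import urlparse

# A URL is just structured text; parsing it needs no network access and
# says nothing about whether the target exists.
url = "http://www.whatever.domain/some/page.html"  # illustrative URL
parts = urlparse(url)

print(parts.scheme)  # http: which protocol to try
print(parts.netloc)  # www.whatever.domain: which machine to ask
print(parts.path)    # /some/page.html: what to ask it for
```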
Just as SMTP defeated X.400, the entire telephone network is rapidly disappearing into the Internet. How could a network built for one purpose compete against a commoditized communications infrastructure? Price is just one issue; the bigger threat is that the whole definition of telephony built into the phone network is open to question. What happens when a phone call is just a software application and there is nothing to charge for aside from the low cost of the Internet connection? Why not have high-quality sound or simply keep a connection open all the time? The only issue is assuring sufficient QOS, or Quality of Service. For a telephone connection this may mean less than a tenth of a second of delay and about 32 thousand bits per second. Given a network that can carry trillions of bits per second, this is likely to become a nonissue as the capacity of the network continues to increase. Voice traffic is simply negligible compared to the large amount of traffic carried by the infrastructure.
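The capacity claim is easy to check with back-of-the-envelope arithmetic, using the essay's own figures (the rounding and the division are mine):

```python
# Back-of-the-envelope check of why voice traffic is negligible,
# using the essay's figures: ~32 kbps per call, trillions of bits
# per second through the network.
voice_call_bps = 32_000                # bits per second per phone call
backbone_bps = 1_000_000_000_000       # "trillions of bits per second"

simultaneous_calls = backbone_bps // voice_call_bps
print(simultaneous_calls)              # 31250000: tens of millions of calls
```

A single trillion-bit-per-second path could carry over thirty million simultaneous calls, which is why telephony becomes just another application riding on the infrastructure.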
The current battle over HDTV (High Definition TV) and Digital TV is going to seem strange in retrospect. Here too, the capability of the Internet is growing far more rapidly than the political processes associated with the television infrastructure. It will soon seem strange that a provider, such as a cable company, decides which "channels" I can view. Even the concept of a channel is just a result of scarcity due to preallocating limited capacity on old networks.
We're already seeing a hint of this with radio on the Internet. You can tune to any station that provides an Internet feed, without limitations of geography and without FCC restrictions. Only a limited number of existing radio stations have taken advantage of this opportunity since their business models were built for the old infrastructure and a revenue model based on local advertising. A new entrant, such as an MP3 music provider, may not even think of itself as a radio station.
Replacing existing technologies is only a small part of what is possible. The IP infrastructure is just as important within a building as across the world. Devices can be connected and can cooperate in this medium without any special wiring (or with no wiring at all). Any investment in new wiring is a good investment since it increases the general capacity.
My alarm clock can readily communicate with the coffee maker. The IP infrastructure doesn't dictate whether they have a meaningful exchange. It's up to the marketplace to learn by doing. Since doing is just a matter of software and protocols, and the penalties for a mistake affect only a few devices, expect to see rapid learning.
This same process applies to industrial systems. Manufacturing equipment in one country can be tied to process management systems anywhere else in the world...
If information flow is indifferent to boundaries, and information flow becomes more important than the flow of hard goods, we have a fundamental shift in the rules of international commerce and the relationships between countries. In effect, the advantages of a coastline in connecting to the rest of the world become available to all.
Of course, the first response is to attempt to retain the status quo of controlling information flow. But the price paid is to abort participation in the innovation-driven marketplace. This isn't to say that such information flow is automatically good. We don't know what the effect of unfettered money flow will be. But we'll have to learn how this system reaches equilibrium and what policies might be necessary to keep it from going into failure modes. Antitrust (without judging whether it is a good idea) is an example of such an attempt to address boundary issues.
The First Amendment of the US Constitution was such a gamble, allowing very free speech. This gamble has paid off remarkably well, even though there are still many in the US who fear the effects of such uncontrolled access to information or, perhaps, to knowledge. A policy of extending the IP infrastructure throughout the world is a way of extending the bet worldwide. And the payoff can be higher in creating a way to share information, not just publish it.
The Free Market
The scientific method, despite how badly it is taught, is not a cookbook set of rules. It is a way of thinking, requiring that ideas be testable and that hypotheses be adjusted in the face of test results. A "fact" (more properly, a hypothesis) that cannot be tested cannot be proven. Untestable ideas and philosophies can exist, but with the caveat that they are subject to question.
In the same way, the marketplace is a mechanism for proving concepts and adjusting in the face of experience. It differs from science in that the process is not necessarily convergent and proofs are rarely unambiguous. But it does contain the necessary mechanisms for adapting to change and evolving new solutions out of its chaos. It is this chaos that provides the power and vitality of the system. We can look at techniques, such as just-in-time manufacturing, as ways of providing even more effective feedback. Economies that have tried to avoid feedback have suffered. Most recently, Japan has suffered from an economic system that failed to flag major problems.
The increasing reliance on the marketplace is part of a general shift in thinking about social and economic policies in the last few decades. It has created, and benefited from, abundance. We see this abundance in the capacity of the IP infrastructure and even in a surplus economy. But "zero-sum" assumptions, in which one party's gain is another's loss, are still common and implicit. The IP infrastructure is important philosophically because it helps us understand these larger trends and gives us a vocabulary for talking about them.
IP Design Issues
The chaotic nature of the IP infrastructure has been a major asset in facilitating growth. The ability to tolerate chaos has made it resilient to failures. Failures are normal, but the net, as a whole, continues to function.
More serious are the design limitations left over from the initial design in the 1970s, when few could foresee their limited experiment growing into the infrastructure it has become.
We are also poorly served by traditional computer science, which has confused solutions created for very simple isolated systems with approaches suitable for the challenges posed by participating in a large infrastructure. Approaches such as object-oriented programming are premised on the assumption that the intrinsic complexities can be hidden. We have barely begun to explore, let alone understand, these issues.
The focus on precise computation needs to give way to creating systems that embody the rich complexities and ambiguities of the so-called real world. The emphasis of programming is shifting from coding isolated tasks to adding policies to a large ongoing infrastructure.
While the technologists are used to coping with new situations and see them as a challenge, it is much more difficult for society, as a whole, to come to terms with such fast changes in the fundamental rules of the game. Transitions are disruptive and rife with unintended consequences. For example, prior to the sewing machine there was a large enough demand for seamstresses that the occupation could act as an employer of last resort. With the advent of the sewing machine, an explicit welfare system became necessary.
When change is very rapid, it is difficult to wait for a new generation of individuals who understand the new environment. It is even more difficult when those setting policies try to apply their "common sense" in a very different environment. A term like "the Internet" gives the illusion of a common understanding while often frustrating deeper dialog. The Internet cannot be managed as if it were the phone network. The Web is not just another publishing medium like a newspaper or book.
The rapidity of change is, itself, difficult to grasp. How does a multibillion dollar industry like the telephone network adapt when it becomes an insignificant part of a much larger infrastructure? How can the television industry and its regulators understand that its specialness is only a temporary franchise?
But these are minor issues compared with the global implications of abundant and unfettered connectivity. It would be naive to assume that we'll be one homogeneous paradise. We simply cannot predict the new social groupings and rules of interaction. How will individuals seek advantage, what will be considered criminal behavior?
On an economic level, what are the implications of an unregulated global economy with the ability to readily invent new financial instruments and concepts? This will be especially difficult during a transition period of mixed economies where opportunities abound for arbitrage and gaming.
Programming and Software Engineering
Just as the term "true" in mathematical logic bears only a passing relationship to the word "true" in human discourse, the ability to write a program is only a small part of solving problems. Concepts like bug-free programs and object-oriented systems do not translate well when we shift the focus from solving local engineering problems on systems in isolation to trying to add intelligence to the devices and services connected via the IP infrastructure.
The most valuable lessons come from the failures of computer science. Those trying to make computers act more like people by creating an artificial intelligence found their naive models wanting. Translating information into databases is very difficult. In fact, the Y2K problem is largely a result of taking practices that work well on paper, such as omitting the century, and using them in contexts where the implicit knowledge (such as what "now" is) is not available.
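The century example can be made concrete. In this deliberately naive sketch (names and numbers are mine, for illustration), the same two-digit year expands to different dates depending on an assumption the paper form never had to state:

```python
# A deliberately naive sketch of the Y2K trap: a two-digit year is fine
# on paper, where the reader supplies the century, but a program has no
# such implicit knowledge of "now".

def expand_year(two_digit_year, assumed_century):
    """Guess a full year; the answer depends entirely on the assumption."""
    return assumed_century + two_digit_year

# The same stored record means different things under different assumptions:
print(expand_year(5, 1900))  # 1905
print(expand_year(5, 2000))  # 2005
```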
Traditional programming is about taking a precise specification and translating it into instructions that an idiot savant machine can follow. It is a small part of the challenge that we confront in trying to translate our ambiguous, inconsistent and conflicting notions into policies.
The telephone network went through a similar transition when responsibility for placing a call was transferred from the operators at the central office to the users by giving them dials.
The IP infrastructure is a much larger challenge. We cannot hide the intrinsic complexities posed by a network of autonomous devices and systems behind idealized rules and objects. We cannot force people to be users who conform to preconceived rules. Conversely, those who do not understand the concepts of the IP infrastructure will be at a disadvantage compared with those who do.
The concepts we developed in dealing with complex computer systems, especially where our thinking was challenged in translating seemingly simple concepts like understanding speech, are valuable in confronting these challenges. But there is a large danger, as the Java fiasco demonstrates, in naively assuming that the IP infrastructure is just another small matter of programming (SMOP is an acronym programmers use to refer to tasks which seem much easier than they are).
Copyright 1999, Bob Frankston