Preface
I don't like writing about the "Internet" because people think I'm writing about a particular technology. This misunderstanding is endemic. In looking at the physical details of the Internet it's easy to ignore the concepts themselves. The basic concept of defining infrastructure from the edge gives us a way to understand complex systems such as marketplaces and social systems. Just as a word processor is not just a typewriter with a screen, Internet telephony is not just a way of bypassing the phone network.
Terms like "Internet telephony" are confusing. It's as if we referred to email as "Internet Letter Carrying". We didn't just replace the typewriter with the word processor and we didn't just replace the post office, we created an infrastructure that changed the nature of communications. In the same way, the ability to replace the entire multi-billion dollar phone system with a small amount of software that anyone can run on their own computer is impressive in itself but it is only a hint of what we can do once we stop thinking in terms of telephony as it was defined in 1886.The ability to create our own solutions gives us a taste of what is possible as we learn how to take control and define the meaning of the infrastructure at the edges rather than waiting for "them" to do something.
The Internet is a laboratory for understanding how we work together to build systems without the need for a central authority setting the rules. We assume that the lack of a central authority would produce anarchy but we find we have far more cooperation and a far greater ability to rapidly solve problems than we would have were we paced by a central authority.
In this essay I rethink traditional infrastructure from the point of view of the end-to-end concept with the Internet being just one example.
Where and What is the Edge?
The word "edge" comes with a lot of semantic loading so I want to be clear that I am using the term in a very general sense. It is a term for a boundary of a system. For the Internet the edge is where the user's TCP and UDP protocols meet the transports basic IP protocols. I am using the term in a general sense for this kind of boundary in other markets. There is no precise definition since the edge is a matter of interpretation or to put it another way, it is where the provider's interpretation meets the user's interpretation. When I read a serious book I might read it for unintended humor or I can view it in terms of its contribute to my comfort as I transform the wood pulp into heat.
The concept of the edge as a place for innovation is not new. Virginia Postrel uses the term "verge" in The Future and Its Enemies. Economists cite coasts as sources of innovation because they represent the boundaries between cultures. I'm trying to look into how this turmoil produces economic value and to understand the importance of the edge as a fundamental mechanism that defines marketplaces.
The Internet as a Concept
The real importance of the Internet is in shifting our model of how systems and marketplaces work. Traditional infrastructure defines the services and thus the meaning of the network or marketplace. The Internet explicitly shifts the definition to the edge. Without guidance and governance we would expect to see chaos and disorder. Instead we find that very large scale systems emerge and continue to evolve far more rapidly than traditional systems. The Internet itself is very simple and thus gives us a laboratory for understanding how such marketplaces work.
Those who see the Internet in traditional terms fail to appreciate the power of the concept and keep searching for a visible hand that defines and governs its behavior. Instead of recognizing that the lack of such guidance is the defining characteristic of the Internet they seek to impose governance. Such efforts are ultimately futile though they can do damage in the short term. The real tragedy is in failing to recognize how these concepts apply to marketplaces and our society as a whole. The current Internet is only a prototype and, if anything, it has too much governance inherent in the current protocols.
Feeling the Impact of the Internet
The New York Times article "Easing of Internet Regulations Challenges Surveillance Efforts" reports that the FBI wants to impose rules on companies offering "voice over Internet" services (or VoIP). The problem is that VoIP itself doesn't depend on such providers. While there are companies that do offer telephony service packages using VoIP, VoIP itself is not a service. As with email, all I need is software on two systems that can talk to each other and they can create their own protocols. I've been using Skype to speak to my son while he's spending a semester in Australia. Its roots lie in instant messaging rather than telephony. I've already got a phone system and don't need to have Skype interconnect with that old system. Instead Skype and others can experiment with messaging rather than focusing on fixing an old system. The fact that Skype is usually far better quality (and sometimes far worse) than a phone call to Australia helps, and may be due to being liberated from the design constraints of that system. It also happens to encrypt the connections. Today we naively assume that the only exposure is when we use wireless links. Wired networks at conferences are very easy to tap and we can't presume that a carrier is able to protect all links even when we choose to trust the carriers themselves.
I was in Sydney, Australia myself last month for an IEEE workshop on consumer electronics. Ralph Justus, President of the IEEE Consumer Electronics Society and also the VP of Standards and Technology for the Consumer Electronics Association (CEA), gave the keynote talk. He described the interacting standards of the television broadcast industry, ranging from the characteristics of the radio signals to how to associate text with the video streams. It's amazing that we can get all of this to work.
The Internet represents a very different approach to providing services -- it doesn't. To those used to seeing television as a service this doesn't make sense because of the presumption that only a broadcaster (or other provider) can create the service. That would be true if each service needed its own infrastructure and it was difficult and expensive to build that infrastructure.
What's different is that we now have fungible connectivity as a resource. When I want to view a video stream (AKA TV) I just connect to a source (or server). If the speed of the network isn't sufficient I may have to buffer the content before viewing it, so I don't depend upon watching live video, but eventually, as with voice, I will be able to. That's not really true -- I can depend upon live video -- but I may have to decide whether I want higher quality (or resolution) or immediacy. Traditional television doesn't give me that choice. TV over-defines the solution and thus limits the value of the service. Note that Firewire -- a standard coming from the TV world -- builds in such assumptions, which limits its value as a medium. We still see such prescriptive approaches under the banner of QoS (Quality of Service). I don't have to think about how the stream is encoded -- most players automatically retrieve the appropriate decoder (codec) if the computer doesn't already have a copy.
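To make the quality-versus-immediacy choice concrete, here is a minimal sketch in Python of the decision a player at the edge can make for itself; the buffer target, chunk size, and the simulated jittery source are all made up for illustration.

```python
# A toy sketch of edge-side buffering, not a real player. The buffer target,
# chunk size, and jittery "network" are assumptions for illustration only.
import random
import time
from collections import deque

CHUNK_SECONDS = 0.5            # each chunk covers half a second of video
BUFFER_TARGET_SECONDS = 3.0    # my choice: larger means smoother, smaller means more immediate

def network_chunks(count=20):
    """Simulate a source delivering chunks at an uneven rate."""
    for _ in range(count):
        time.sleep(random.uniform(0.2, 0.8))   # jittery arrival times
        yield CHUNK_SECONDS

def play():
    buffered = deque()
    playing = False
    for chunk in network_chunks():
        buffered.append(chunk)                 # a chunk arrives
        if not playing and sum(buffered) >= BUFFER_TARGET_SECONDS:
            playing = True                     # I decided I had buffered enough
            print("starting playback")
        if playing and buffered:
            buffered.popleft()                 # "display" one chunk per arrival

if __name__ == "__main__":
    play()
```

The point is not the code but who holds the knob: the viewer at the edge, not the broadcaster, decides how to trade smoothness against immediacy.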
In the traditional approach all of the elements must work together in order for the system to work. With the Internet we only need to find some combination that works and, lacking that, we can find our own solutions.
While I've emphasized the importance of connectivity as a resource, we started with purpose-built communications infrastructure and redefined it as connectivity. In the same way we can convert other purpose-built infrastructures and products into raw material for innovation at the edge.
The traditional approach with its interdependent standards can be made to work but only at a high cost within a limited domain. Telephony and television broadcasts were built upon such standards. The TV camera had to be precisely synchronized with the display of the image. In the absence of an alternative we managed to get it to work. Today there is competition that is far more agile.
I was attending the workshop to receive an award for engineering excellence for my home networking effort. My approach to home networking was very simple -- just extend the Internet into the home without making any assumptions about the content. My "competition" was the effort to provide a standard residential gateway that would be used to deliver services defined by infrastructure providers. Such efforts continue with little progress while home networking continues to evolve very rapidly.
Ralph spoke about the CEA's concern about DRM (Digital Rights Management), which makes it difficult for people to share information with their family. It's actually far worse when every element of the infrastructure must be aware of the rules for the bits and thus makes innovation a threat to the infrastructure.
Consumer Electronics
In January I attended the CEA's International Consumer Electronics Show. I've been following consumer electronics all of my life and remember the introduction of stereo as a novelty. On the flight to Las Vegas I saw a TechTV show in which a lucky winner's house got a digital makeover that consisted of putting flat panel TVs everywhere. The technology may be digital but it is not empowering -- the TV broadcasters are still in total control of what is available to watch and the consumer is just that -- a consumer. To be fair, TechTV itself does try to educate people but it's very easy to talk about "digital" and "The Internet" and miss the real revolution.
There used to be two shows. To greatly oversimplify--Comdex was for computer geeks and CES was for digital entertainment. For a while CES was shrinking as more and more of consumer electronics got subsumed into computing but ultimately it was Comdex that fell apart because computing was too pervasive and it had to decide if it was a show for corporate computing or for retail/consumer computing. As CES absorbed consumer computing Comdex had to reposition itself as a show for "IT" or Information Technology. The concepts of computing and connectivity are too pervasive to define a single industry.
CES itself is still spread across a broad industry though the gamers now have their own shows as does the adult entertainment industry. The main floor of the current CES is dominated by the major consumer electronics companies touting their latest wonders with lots of attention to television and other packagings of digital technology. Samsung, for example, has its DigitAll theme. Panasonic has its PC-based home entertainment server.
Ultimately I find the packaging very frustrating. I have a Panasonic two-line cordless phone system at home. It has a limit of 8 extensions, which is not enough. I also want a better paging capability but I can't mix and match features because all of the phone systems are closed and proprietary. I can't even use instruments from other Panasonic phone systems! Yet Panasonic was also showing a line of web cameras with 802.11 access and motor control. Traditionally such devices would be part of a "solution" such as home security or baby monitoring but the PC world has created a marketplace for such components.
There is no sharp distinction between components and "solutions" -- it depends on how I use them, though some are easier to build on than others. The key is having an interface that allows me to write my own software. Even then it is not entirely clear. I'm very excited about screen scraping -- the ability to operate a web site with a program rather than by typing at it. While some sites do have programmatic interfaces such as web services, I don't need to depend on their availability. Even when the information is available it may not be what I want. Financial sites allow me to download only a subset of their data to Quicken but I can write a program that gets more information, including images of checks (the few that I still write).
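As a hedged illustration of what I mean by screen scraping -- the URL and the page structure below are hypothetical stand-ins, not any particular bank's site -- a few lines of Python using the requests and BeautifulSoup libraries are enough to pull a table off a page that offers no programmatic interface:

```python
# A minimal screen-scraping sketch. The URL and the HTML structure are
# hypothetical stand-ins for a site that exposes no programmatic interface.
import requests
from bs4 import BeautifulSoup

def fetch_transactions(url="https://example.com/account/history"):
    page = requests.get(url, timeout=10)
    page.raise_for_status()
    soup = BeautifulSoup(page.text, "html.parser")
    rows = []
    for tr in soup.select("table.transactions tr"):      # assumed table markup
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if cells:
            rows.append(cells)                            # e.g. [date, payee, amount]
    return rows

if __name__ == "__main__":
    for row in fetch_transactions():
        print(row)
```

The site never agreed to be a component; I made it one by defining my own interface at my edge.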
I'm not alone in trying to take control at the edges. Microsoft showed their 1080p video (WMV/HD). The television broadcast industry is struggling to get to 1080i. "I" means interlaced -- successive frames of the picture alternate odd and even scan lines, which creates a fuzzier image than "p" or progressive. The reasons for interlacing go back to technical hacks that were important back in the 1940's and now just make it difficult to work with video. Microsoft is not alone in providing alternate video distribution but may be able to cajole companies to make big release movies available in this format. Industries that require major investments and decades to "upgrade" can't compete with distributed innovation at the edges.
Companies steeped in a tradition of providing complete solutions find it hard to even grasp the concept of innovation at the edges. Digital systems have changed the rules because it is becoming increasingly feasible for people to do things themselves. In traditional infrastructures all of the elements within the infrastructure have to cooperate and thus it is vital that they all conform to a well-defined standard. Such standards can take years as experts from competing companies try to come to an agreement. They must get it right before shipping, unlike those at the edge who learn by doing. I often cite the competition between X.400 (a worldwide government-supported email standard) and SMTP (Simple Mail Transport Protocol -- a temporary hack). SMTP didn't win over X.400; X.400 was just incapable of becoming a player at the edge.
Standards are best when they arise from experience. As long as I can write a program to control a device (this is a bi-directional process since I must listen to the device as well as speak to it) I can enhance it. If I have two devices with different interfaces I can deal with it. This isn't a long term problem because the manufacturers do learn and the interfaces tend to converge, but we should always have the option to provide an alternative.
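A minimal sketch of what such bi-directional control can look like, assuming a hypothetical device that accepts simple text commands over a TCP socket; the address, port, and command vocabulary are invented for illustration:

```python
# Speaking to and listening to a device over a plain TCP socket.
# The device address and its command vocabulary are invented for illustration.
import socket

def query_device(host="192.168.1.50", port=4998, command="GET_STATUS"):
    with socket.create_connection((host, port), timeout=5) as conn:
        conn.sendall((command + "\r\n").encode("ascii"))   # "speak": send a command
        reply = conn.recv(1024).decode("ascii").strip()    # "listen": read and interpret the reply
    return reply

if __name__ == "__main__":
    print(query_device())
```

Once I can speak to the device and interpret what it says back, I can wrap it in whatever behavior I want regardless of what "solution" the vendor had in mind.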
Notice that I used the terms "listen" and "speak". I'm not trying to anthropomorphize the technology. It's just that those words capture the correct ambiguity. "Listen" means that one not only receives a signal (such as a sound wave) but also interprets it. What it means to "understand" the signal is an open issue, be it for a machine or a person. Operationally it is sufficient that it does enough to act on the signal and that's true in both cases. We need to be careful of simplistic interpretations. That's why I refer to the Federal Communications Commission as the Federal Speech Commission because it is actually regulating social behavior in the guise of defining neutral technical policies.
This confusion pervades the discussion of the Internet and treats it as a world apart from our so-called "real" world. Much of what is considered an "Internet" concept, such as the "blog", is really a social concept. It's enabled by the new technologies but we shouldn't identify it with the underlying technology itself. We can't fully separate out the social issues from the technical. The technology does give us new options for social interaction. At the same time, my attempt to explain the technology is itself a social process. I'd rather be hacking some programs at the edge or doing design or, perhaps even, being entertained. My social and self-imposed obligation to blog does provide a necessary prod to try to communicate my observations to a wide audience.
Participating in a larger community is valuable for both social reasons and technical reasons. On the technical side it helps in establishing standards. We don't want our standards to prescribe solutions; they do allow us to build upon shared experience. Thus if I create a set of tools that take advantage of an open interface then others can contribute tools to make the interface more valuable and thus encourage manufacturers to conform.
The words "standards", "specifications", "interface", and "open source" are different but in social terms they are simply ways of cooperating and forming communities. If we lose sight of this purpose and don't trust the marketplace and social processes we tend to redouble efforts to enforce standards rather than allowing them to evolve and adapt.
Which brings me back to the Internet itself…
I find Skype fascinating because it is solving its own problems at the edge. It has its own directory service and deals with intermediate problems such as NATs (which hide the machines inside a network) without having to wait for others to do it for them. I find it to be an interesting contrast with "traditional" VoIP, which adheres to SIP (Session Initiation Protocol) and other specifications in what I call the "TawKi" (Telephony as we Know it) model. I don't mean to draw too sharp a distinction since Skype is currently a closed community and there is indeed a lot of room for innovation within and around the SIP world.
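To give a feel for the kind of problem Skype solves for itself, here is a generic sketch of the rendezvous idea in Python -- not Skype's actual protocol, and the server address is a made-up placeholder. Two peers behind NATs each announce themselves to a mutually reachable server, learn each other's public addresses, and then send UDP packets directly to each other (whether the punch-through succeeds depends on how the NATs behave):

```python
# A generic rendezvous/hole-punching sketch, not Skype's actual protocol.
# The rendezvous server address below is a hypothetical placeholder.
import socket
import sys

RENDEZVOUS = ("rendezvous.example.net", 9999)

def server(port=9999):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    waiting = None
    while True:
        _, addr = sock.recvfrom(64)              # a peer announces itself
        if waiting is None:
            waiting = addr
        else:
            # Tell each peer the other's public (post-NAT) address and port.
            sock.sendto(f"{addr[0]}:{addr[1]}".encode(), waiting)
            sock.sendto(f"{waiting[0]}:{waiting[1]}".encode(), addr)
            waiting = None

def peer():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(b"hello", RENDEZVOUS)            # register; the NAT creates a mapping
    data, _ = sock.recvfrom(64)                  # learn the other peer's address
    host, port = data.decode().split(":")
    sock.sendto(b"direct packet", (host, int(port)))   # punch through our own NAT
    print(sock.recvfrom(64)[0])                  # receive the other peer's packet

if __name__ == "__main__":
    server() if sys.argv[1:] == ["server"] else peer()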
Skype is just one of many P2P or "Peer-to-Peer" efforts. The existence of the P2P community is telling since the Internet itself is fundamentally Peer-to-Peer. The Internet is based on the end-to-end principle which departed from traditional telecommunications by moving seemingly basic functions to the edges. As the Internet became popular a number of impediments such as firewalls and NATs compromised this architecture. It became too much like the networks it replaced -- anything that wasn't necessary to support the existing applications was discarded.
Despite these impediments the P2P communities managed to work around these problems, though with varied success and only limited synergy. Blogs are one example -- most people host them on servers. The term server is asymmetric but the blogs themselves all operate as peers -- you don't have to go through blog central though you can choose a blog service for your own convenience.
"There is one thing more powerful than all the armies in the world and that's an idea whose time has come"--Voltaire
The Internet is about an idea and not a technology. It is the idea that we can define meaning at the edges of a system. The current Internet is a prototype. Its edges are shared computers or even whole networks, and it assumes these edges don't change very rapidly.
Software gives us the ability to redefine the meaning of closed systems by allowing us to define new rules for using the technology. We can then interact according to our rules rather than the provider's definition. Even within the Internet community we see the P2P community wresting control from those who are delivering particular technologies under the Internet brand. As with much of what we've achieved, the partial successes work so well that people seem to find it hard to believe how much more is possible and, instead, tend to protect the new status quo.
Each change is, in itself, threatening until it becomes the new status quo. Any time we need to ask permission or simply require others' support and investment we find ourselves stymied or, at the very least, can proceed only at a limited pace. We don't have to ask permission as long as we can build on what is already available by defining the meaning at the edge. The digital technologies, including the ability to share standard protocols and tools, give us added leverage. Most changes fail but the damage is limited since those at the edge would normally ignore that which they don't like -- ideas only spread if they make sense to some community but they don't require everyone to accept the change.
Getting the Edge
We need to understand how we build systems at the edge and what we can do to facilitate the process.
As part of enabling the edge we need to resurrect the initial vision of the Internet as an end-to-end or peer-to-peer facility. The current Internet discarded what seemed to be essential networking concepts such as circuits and reliable delivery. Today's Internet defines the edges and a particular approach to routing traffic that entails tracking the edges in service of routing. The concern about the insecurity of wireless links should extend to the entire infrastructure. We've tried to patch over these problems with the DNS and by growing the size of the address but these don't address the fundamental issue.
In the 1970's few people had ever operated a computer. The idea that we could use digital concepts to create a word processor or telephone at the edge wasn't at all obvious. Computers were simply too expensive for "trivial" uses such as replacing typewriters. As the costs came down we needed training wheels in the guise of the current Internet.
We are now ready to move beyond the training wheels and treat the end points themselves as just abstractions. We can choose an end point identifier ourselves -- just a very long number. We can consider any pair of end points as defining a relationship. It's all very abstract and very simple.
We get behavior akin to the current Internet when we send packets of bits between the two end points. The end points need to find a path for exchanging packets. The relationship is indifferent to the path and thus the actual routing can be very dynamic. How the nodes find each other and maintain the path -- that's a technical detail which I will write about elsewhere. It's the same problem we faced with the current Internet -- it wasn't obvious that one could get reliable traffic over an unreliable transport. It was even less obvious that one day we'd be able to use that network for voice conversations. In terms of 1970's technology it wasn't feasible but the Internet and the technology evolved to meet each other.
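As a very rough sketch of what "just a very long number" and a path-independent relationship might look like -- the 128-bit size and the message framing here are my own assumptions for illustration, not a defined protocol:

```python
# A toy model of edge-chosen end point identifiers and path-independent relationships.
# The 128-bit identifier size and the framing are assumptions for illustration only.
import secrets

def new_endpoint_id() -> int:
    """Each end point simply picks a very long random number for itself."""
    return secrets.randbits(128)

class Relationship:
    """A relationship is just a pair of end point identifiers.
    It does not know or care which path carries its packets."""
    def __init__(self, local_id: int, remote_id: int):
        self.local_id = local_id
        self.remote_id = remote_id

    def wrap(self, payload: bytes) -> bytes:
        # Tag the payload with the two identifiers; any available transport
        # (today's IP, a radio link, something else) can carry the result.
        header = self.local_id.to_bytes(16, "big") + self.remote_id.to_bytes(16, "big")
        return header + payload

if __name__ == "__main__":
    alice, bob = new_endpoint_id(), new_endpoint_id()
    link = Relationship(alice, bob)
    print(link.wrap(b"hello").hex())
```

Whatever transport happens to be available can carry the result; the relationship itself never mentions a path.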
The intrinsic capabilities of today's Internet are hidden beneath its impediments. By building upon an abstract model of connectivity rather than the accidental properties of today's Internet we can unleash these capabilities.