Be warned, this is more about my view of the show than a gadget fest. It’s more a dump of ideas than a simple set of observations since all the issues are intertwined. So this is not for all readers. I will try to revisit and unpack individual themes in other essays.
CES (The International Consumer Electronics Show) is a big tent because so many of today’s consumer products are based on or related to electronics. While televisions (whatever they are) are still front and center there are also washing machines and backpacks.
As a show the CES is dominated by the current artifacts of technology. This is wonderful press fodder and can excite the imagination. But I want more.
I approach the show as an opportunity to get more than press clippings. I want to understand what is going on behind the curtain, and the show affords me a chance to find technical people who can help me better understand the underlying capabilities and the direction of the industry.
Technology and business models interplay both in creating opportunity and in shaping what is viable.
I’m writing these comments after digesting the whole of CES rather than reporting from detailed notes. One theme implicit in this and other writings is the idea that businesses have a genetic makeup that aligns with certain business models and not others. Changes in technology force changes in markets that require companies to operate outside this comfort zone and they often if not usually fail.
Lurking in the background is the real impact of bits and the idea that meaning comes from context. This is a theme that is fascinating me. It plays out in questions like what is a TV. As these products are designed by software the product categories must change and even the concept of categories changes.
As we’ll see, old line companies and industries are having challenges coming to terms with these changes.
Taking apart the TV
The Center Hall was dominated by companies making and selling televisions. A television used to be a well-defined device consisting of a tuner to pluck signals out of the air and a screen on which to display the picture.
With Cable TV the signal came from a set top box. The set top box, in effect, replaced the tuner but TVs continued to have tuners because people expected them.
A computer screen was similar to the television in many ways but the crucial difference was that the screen was refreshed far more rapidly. Broadcast TV relied on the screen’s phosphor to “hold” the image until it could be rewritten a fraction of a second later.
In the late 1990’s electronics allowed any screen to be used for computers and “TV” purposes.
The Television Value Chain
The components in a TV (and other products) are likely to come from third parties. A company seeks to add value either in the capabilities or the brand. As the value comes from software the niches in this value chain get remixed and even disappear.
A computer monitor is basically a commodity. Though there are many variations it is hard to justify a high price.
TVs on the other hand have all sorts of capabilities and remote controls even if, internally, they aren’t much different from a monitor.
The TV industry tried to embrace FireWire (AKA IEEE-1394) as a technology that allowed one to watch “content” on screens and manage media libraries. It was another take on the component entertainment center from the days of Hi-Fi when we’d plug the record player and tuner into an amplifier. The new devices are more sophisticated and need more than just a simple wire. The problem is that IEEE-1394 incorporated application knowledge in the protocols, thus, I argue, dooming it.
The lesson of computing and the Internet is that decoupling system elements is an essential part of the new landscape. It’s a lesson that is hard to embrace as each product group and standards group is isolated in its silo.
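The contrast can be made concrete with a toy sketch. Nothing below is a real API; it only illustrates the argument: a transport that bakes application knowledge into the protocol (as I argue IEEE-1394 did) versus one that just carries opaque bits (as IP does).

```python
# Toy illustration with hypothetical interfaces -- not any real protocol API.

KNOWN_MEDIA = ("mpeg2", "dv")  # the formats the spec happened to anticipate


def send_coupled(frame: bytes, media_type: str):
    """1394-style: the wire itself must recognize the media type."""
    if media_type not in KNOWN_MEDIA:
        # A new format means revising the protocol spec, not just the endpoints.
        raise ValueError(f"unknown media type: {media_type}")
    return (media_type, frame)


def send_decoupled(payload: bytes) -> bytes:
    """IP-style: the wire moves opaque bytes; meaning lives at the endpoints."""
    return payload
```

With the decoupled version a new codec needs no change at all to the transport; with the coupled one it needs a new edition of the standard, which is the inward-looking dynamic I’m describing.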
We see this in the crop of new “smart” Internet-connected TVs with all sorts of widgets. At home my Verizon set top box sports widgets (from Yahoo?). They are pretty lame. Same for my TV as both vie to own the relationship with me.
It seems as if every TV manufacturer is out to make the best “Internet-connected smart TV” but what does that mean? Obviously they can run applications. One of my first stops was LG (once known as Lucky Goldstar). I did find an engineer who, with the help of a translator, gave me his card so I could get information on the SDK (Software Development Kit).
Since I have a Samsung TV I was especially interested in the software inside. With my own TV I encountered inconsistencies and problems updating the software in the TV. I did find a support person who was willing to help me. It seems that each of their TVs has its own software rather than providing a consistent platform story. I confirmed this in conversations during the show.
The terms “Internet TV” and “Smart TV” seem interchangeable because the value comes from what one can do with the network connection.
At Sony they have their Broadband Internet TV and their Sony HDTV powered by Google TV® (to use their name). The latter was very similar to the Android device sold by Logitech as their Revue except that it contained a Blu-ray player. But the Sony booth people were unable to sell it as anything other than an accessory to the Sony TV even though it was obviously a generic player.
Instead Sony wants to put their mark on it and provide a consistent user interface across both devices. This makes sense for Sony which has been more about marketing and branding than manufacturing.
It also brings to mind the days before Windows when each manufacturer tried to put their face on MSDOS (the command line operating system) but they tended to make the experience worse.
As with Windows there is an advantage to the user learning the common elements of the Android user interface across devices.
Aside from the different skill sets needed for software devices and hardware devices the two elements have different lifecycles. All-in-one products have a niche but lose value rapidly as the elements compete across different markets.
This is why I expect that the real battle is over which boxes provide video input.
Huh? That’s completely backwards. That question made sense fifty years ago when TVs were the center of attention but today they are just surfaces upon which to view content and interact with other devices.
We’re still thinking in terms of reasons to buy TVs in the same way that the telecommunications providers consider the purpose of the phone as selling “minutes”.
In fact that’s just the attitude I found a few months ago at a small developers meeting sponsored by Nokia and Intel to talk about their MeeGo operating system.
Intel sells chips but at CES they have to show what you can do with the chips. They do show the Google TV devices which, unlike the other Android devices, are based on Intel’s X86 chip.
The word “television” also refers to the business of delivering content (TV programming and shows), so the boxes are first judged in terms of their ability to act in the role that the set top box and tuner played.
For now the Android devices (Google TV) and MeeGo are measured this way. Boxee is another contender. Boxee is actually a software platform aimed at taking various sources from the Web and making them uniformly accessible.
With “Over the Top” TV, content is delivered over IP. The end point can be a traditional set top box or it can be a computer, be it Windows, an Android device or even a computer inside the TV.
Apple is also bringing the iTunes ecology to the TV with Apple TV, which is currently limited to content delivery from various sources rather than running applications.
Let’s not forget Yahoo and their widgets along with their WDK (Widget Development Kit). Yahoo is providing software capabilities both inside and outside the TV. They can do a better job supporting legacy devices such as set top boxes but it’s unclear if that is a sufficient advantage if it limits what features they can take advantage of.
Of course they are all doing their app stores with approved software.
There wasn’t much presence from other contenders such as the PS3, the Xbox and even Windows. Windows is already being sold as a TV front end but there are currently few PCs suitable for placement next to the TV.
Telecom Italia is interesting. They were showing at the Intel booth. They are selling their own MeeGo-based box. For Verizon or Comcast (or, in beta, Cox) VoD I need to have a subscription to their cable service. The Telecom Italia representative said they have no such requirement. I could just buy a box (or maybe just software). I may be reading too much into what the representative said but the company itself might no longer care about owning the wire at all.
This is a heady mixture of trends that I have only touched upon. Suffice to say that the TVs on the show floor only hint at the changes to come.
And we haven’t even started to explore what else we can do with such screens beyond 1950s static television or the 1990s dream of faux-interactive TV.
Moving on, let’s not forget 3D, which was the big thing last year. It had far less of a presence this year as the industry sobered up after last year’s exuberance driven, in part, by Hollywood’s fantasies in light of the success of Avatar.
The term “3D” is ambiguous because it is also used for representing 3D images on a 2D screen. In this context it is used for stereoscopic viewing to give the illusion of depth.
I became a believer in 3D myself but not as a driver of the industry. As screen refresh speeds increase adding 3D capabilities is merely a matter of updating software and using shutter glasses that respond to an out-of-band IR signal. For comments on 3D see “3D View”.
Things are a little rockier now. Rather than waiting for the price of glasses to come down some manufacturers are using polarized screens so that you can use the same throw-away glasses used in movie theaters.
For now I consider the Fuji Real 3D the mainstream 3D camera but others are becoming available. There are 3D-ready video cameras that can be upgraded with an adapter. There are also software efforts to take successive images to create a 3D view. Kodak is also printing 3D for viewing with colored glasses. While I’ve been disappointed by Fuji’s lenticular printing service I look forward to improvements in that technology so I can print images of my wife’s fiber art.
3D also has problems. My Asus 3D monitor can’t be used for 3D content from my STB. The mechanics of 3D limit the creative aspects of movies and it’s hard to share the screen with multiple 3D images and sources. Stereoscopic viewers have been around for nearly two centuries. The difference now is that we are starting to be able to assume interesting hardware and, again, it’s up to the software to explore the possibilities.
Digital Broadcast TV
The effort to adopt advanced standards continues even as IP connectivity moots it. Even though I have issues with Verizon’s LTE it makes far more sense to use it as a video transport than continue with the idea of broadcast TV predicated on the assumption that we all want to watch the same thing. It’s a little late in the game to invest much effort in one-way signaling when we can do so much more with a full bidirectional relationship over IP.
At least it would make sense for video if Verizon didn’t render the service moot with usage caps (PC Magazine).
Tablets Schmablets and such
Yeah. It’s easy to build a tablet; harder to create the right experience. The iPad set a high bar but it has its limits. Motorola showed Android 3.0 (Honeycomb). One of their own engineers who happened to be attending asked about putting multiple applications on the screen at once. Not an option. I find this troubling. Sure it’s simpler to do just one thing at once but it reminds me of the attempt in Windows 2.0 to force tiling rather than allowing arbitrary placement of windows on the screen.
At some point the industry needs to grow up and embrace a variety of form factors and recognize that the screen is more than just a window into the only thing you are doing at the moment. This is why I look forward to an effective Windows tablet. But I haven’t seen any yet.
I do have the Windows phone and didn’t learn much new about it at the show. I find it annoying that Microsoft is going too far to imitate Apple’s lock-down strategy rather than building to its strength in empowering developers.
As an aside one surprise announcement is that Microsoft is going to make Windows available on the ARM chip. The ARM has a very interesting history but is relevant here because it is better suited for low power applications but Intel is fighting back with multicore Atoms – little things that have a fair amount of power.
Motorola is also positioning their Android phone as a component platform by having a laptop shell into which you can place the phone and turn it into a laptop. It will be interesting to see how Android is able to adapt to all these combinations.
Tablets are interesting just as the original Palm PDA was in that they take the capabilities we get from our computers and make them more available.
Why, for example, do we still have light switches when I can have a smart surface that provides me with a better way to interact with my home? It could take up the same space as a light switch but with a richer interaction. Or it could be the device in my pocket.
Which is a good segue to …
Smart and/or Connected?
Warning, this section is very opinionated as I see no learning from history. I’ll start with the word “smart”, an unfortunate term because it gives the impression of intelligence when we’re simply trying to talk about devices that aren’t totally inert. A power grid doesn’t need to be very smart to report that a circuit has failed, yet by using a term like “smart grid” we set very high expectations and don’t provide mundane capabilities as we wait for these fantastic expectations to be met.
I’ve long been interested in connected devices and home control. In 1997 I wrote about open interfaces and more recently wrote about a concept I call ambient connectivity in which we can assume that we can simply exchange bits with other devices without having to negotiate with a provider.
The big news for connected homes this year is energy management but nothing has really changed. It’s still more of a gimmick than anything real. The power company is not going to go into your home turning on and off appliances.
We still have Zigbee fighting it out with Z-Wave with other IP-based efforts sitting on the sidelines. We’re still doing proprietary protocols or stacks or profiles or whatever you want to call them that work only within their islands.
The idea of funding the development of a protocol and controlling it by licensing doesn’t work very well in a connected world because it prevents exploration of new ideas. It feeds the kind of inward spiraling that we see in The Innovator’s Dilemma. Z-Wave has taken this tack for home control, as has DLNA (the Digital Living Network Alliance). DLNA builds upon UPnP which hasn’t gone very far because (in my opinion) Microsoft didn’t go with my approach.
My approach was to create a generative dynamic as I did with home (data) networking rather than doing a solution that only worked in special situations within islands as I wrote in Maker Disconnect.
Putting on my ICCE hat for the moment, the problem goes to the heart of a standards process driven by short term corporate needs. It’s hard to develop simple enabling standards that enable sharing. One reason for the success of personal computers is that they had no one purpose. They might’ve started out as game machines but that interactivity allowed us to turn them into financial tools and, over the years, so much more.
We’re also stuck in a command and control mentality rather than embracing peer relationships and best efforts as a constraint.
One of Verizon’s big announcements is its new effort to reenter the home to turn day-to-day life into billable events as they control energy usage and all else for you. The effort shows no learning from history as the company fails to escape from its own business comfort zone.
Speaking to one participating company it was discouraging to see how much of the architecture was driven by purely accidental properties of today’s 3G and LTE implementations.
This speaks to the larger problem of old line electric grid companies trying to understand the new opportunities. At a recent talk at MIT I heard that the energy companies are paying incredible amounts of money to the carriers merely to exchange a few bits for a less-than-stupid grid. The term “smart grid” must be avoided since it’s more about mongering than creating value.
Verizon is spending big bucks on LTE creating yet another “broadband” path. It seems as if they are throwing away a perfectly good fiber plant they just built by saying you can do it all over their wireless network. They aren’t quite saying that, but then again why is Verizon ignoring all their newly deployed fiber to go to distant towers? It’s an indication that they aren’t facing effective competition – just the FCC’s illusion of competition.
They and others start lamenting the lack of “spectrum”. If they simply put access points along their fiber path there would be no shortage. But that would undermine their business model which is based on keeping bits scarce in order to create value. There’s nothing secret about the plan. What is surprising is how little awareness there is about what bad public policy it represents.
As PC Mag reported data caps and LTE don’t go together. So if you can’t use LTE’s speed what is its purpose? In the example of energy monitoring the rationale for LTE is based on limitations on the implementation of IP over Verizon’s 3G network though LTE isn’t that much better.
This is not to disparage the Verizon staff. The engineers and others are trying to do a good job and I hope to follow up to solve problems I have working around their FiOS implementation.
But it does bespeak of the habits of a company forged in the foundry of legislation in which it doesn’t face real competition and has become dependent upon layers of control. It can build yet another physical infrastructure rather than taking advantage of what we know about Internet connectivity to make far better use of its current resources.
Let’s not forget that the primary purpose of sending all traffic through cellular towers is to generate billable events, and the cost of having parallel wireless and wired systems is (as I’ve written) nothing more than the cost of billing to cover these very same costs of billing.
For more on connectivity you can read my attempt to Demystify Networking.
Medicine and Radios
I got a chance to learn about ANT+ and Continua. In some ways this is the Zigbee profile all over again but I’m more sympathetic because the combination of the two is interesting. ANT itself is a variation on the low power Bluetooth radio. ANT is more like Wi-Fi in that you can deal with the radio on its own merits. At the moment you have to get a license to build an ANT radio but that is likely to change. This means that any device with a Bluetooth radio can also support ANT.
ANT is being adopted by medical device makers so that you can connect a scale to a health monitor. For now there are other families of radios, each handling single-hop connections. ANT+ includes profiles for various purposes.
Continua is a consortium to share medical information over IP. I don’t know the details but the idea is a step in the right direction. There are bridges between the ANT+ and Continua world and between the ANT radio and Wi-Fi.
I would like to see this simplified so that wireless networks aren’t all that special. We need to (again) learn that decoupling the protocols from the radios and other factoring is vital. Having a large market for the radio lowers the price and creates an opportunity for dynamic evolution of standards. Even better if the standards are adopted rather than imposed.
More to the point we need components. If the ANT radio has benefits over the Wi-Fi radio then we should embrace it as a complementary approach though having a single radio that can manage power would be better.
Just before the show Qualcomm announced it was buying Atheros, a company that manufactured relatively open Wi-Fi chips. Having a limited number of suppliers for the chips as well as consortia selling the chips is a topic in its own right.
I was reminded of this in speaking to OOMA, which sells phone services through their box. For a while they were using HomePNA, which I was partially responsible for, but they told me they dropped it because it was unreliable. I purposely set modest performance goals for HomePNA in order to increase reliability. But Broadcom, which wants to sell high value chips, pushed for it to be an entertainment medium and so emphasized high speed over reliability. I wonder if their desire to sell entertainment chips caused a decrease in reliability. This interaction between perceived markets and technology is a theme through this essay.
I noticed that GE, Whirlpool and LG were showing their new smart appliances. I also noticed that the washing machines looked remarkably similar and assume that they are all made and designed by LG.
While the machines are indeed becoming smarter I didn’t see much indication of an understanding of connected devices. The simple example I use is that I should be able to write my own program to detect that the wash is done and send me a message. Nothing fancy here but very useful.
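To make the example concrete, here is a minimal sketch of what I mean. The status values and the idea of polling the machine are my assumptions; no appliance at the show actually exposed such an interface.

```python
# Hypothetical sketch: assumes a connected washer that exposes a simple
# readable status. The statuses ("idle", "running", "done") are invented
# for illustration.

def watch_for_done(readings):
    """Yield a message each time the washer goes from 'running' to 'done'."""
    previous = None
    for status in readings:
        if previous == "running" and status == "done":
            yield "The wash is done."
        previous = status


# Statuses as they might be polled once a minute from the machine.
messages = list(watch_for_done(["idle", "running", "running", "done", "idle"]))
```

Nothing here requires a “smart” appliance, only one that isn’t inert: a device that makes its state available as bits for my own software to interpret.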
Cars with OSHA warnings (because the audio can blow out eardrums) are an old tradition at CES. But cars are becoming “smarter” and becoming software platforms.
The car companies are being very cautious about opening up the APIs but it is happening. For example Ford now has an approval process for applications, such as Pandora, that can be installed in cars, and for upgrading the software.
QNX is a Unix-like operating system now owned by Blackberry, which uses it in its tablet (an interesting tidbit in its own right). I was surprised to find that it’s also in cars like BMWs. GM OnStar is now available as a service apart from GM cars.
Things have progressed since I was in Microsoft’s “auto PC” group in the mid 1990’s and it’s a slow process in an industry that is hemmed in by regulations and caution. But the trend I’ve seen over the last year is to factor out the platforms and to figure out how to make the capabilities more available as well as take advantage of the software capabilities.
There were lots of one-offs, such as glasses that switch between two focal settings using an electric current as an alternative to bifocals (or trifocals).
Each year there are areas with a large number of booths from Asia, mostly China. For the most part they are simply variations on common products but there are also interesting ones hidden among them such as a jacket with a simple electric heater like an electric blanket.
At the Sharp booth I saw LED lights that appear to be starting to make sense for the consumer market. They are still pricey but are bright and dimmable. I’ve been trying LED lights for years and ordered two (though not necessarily from Sharp) which may finally be reasonable for my kitchen. While the lower power is attractive I’m far more motivated by not having to change the bulbs for another 25 years.
At the Panasonic booth I learned a little about the display marketplace. Panasonic has been selling plasma rather than LCD displays. While one can argue the merits of either technology, apparently they are planning to go all LCD for the smaller sizes for the simple reason that plasma would require a new manufacturing facility for that size and it’s simply not economical to retool. Once again market size triumphs.
The downside is that the scale of LCD TV sales means that computer screens are less suitable for reading. Instead of the 1920×1200 suitable for reading a whole page the market is now dominated by 1920×1080 which isn’t quite enough. Sigh.
Best Buy is a sponsor of the show providing badges for the press and bloggers. This seems strange since they should be buyers and presenters but the CES is also a show for the press and Best Buy, as one of the few remaining retailers, benefits from CES being free advertising for the products it sells.
And that Darn Internet
The Internet itself is still treated as a CDN (Content Delivery Network). The full impact of the Internet as providing peer connectivity among devices is still not felt nor is the full potential of software.
This is where I hope the IEEE and the ICCE can provide leadership for the future. But it’s hard since analog electronics and the days of spectrum are still central to IEEE’s conception of itself. Bits decouple the artifacts and devices from the bits we use to exchange and encode information.