"Kava" Open Source Wearable Computer

In the summer of 2000 I joined AT&T Labs' "Menlo Studio", a group whose charter was to do prototyping of internet devices. When I arrived, it had produced one successful prototype - a small firewall appliance - and was working on a second - an internet radio. The group had also produced a number of concept mockups of futuristic "edge devices" - portable internet devices, some of them wearable, and some wireless.

As I settled into the group, it became apparent to me that the radio project was struggling mightily. The design had specified an under-powered CPU, and the team was sweating bullets and assembly code to try to get it to work - decompressing the audio stream consumed more CPU cycles than they had initially estimated.

I thought about this, and it was apparent to me that the radio design was too close to that of a real product - it was prematurely optimized. It also didn't have much in common with the previous prototype, although both were ARM-based Linux systems.

At this time I was proposing the prototyping of an idea of my own - the Fair Witness video escrow system. I decided to push for a radically different approach to prototyping the sorts of systems that Menlo Studio was trying to create - an approach that would enable prototyping many different ideas and systems in a short period of time - six or ten in a year instead of one to two.

My core idea was very simple - and it did a structural inversion on the design approach previously used by Menlo Studio. Instead of starting a prototyping project with a specific intent, and then designing capabilities to match that intent, I proposed starting with lots of capabilities in one general purpose platform - and subsequently using subsets of them to realize specific intents... intents unknown until later.

This approach would mean that any given prototype would be over-designed, and too expensive to be a product - it would have unused capabilities under the hood. But the key would be to rapidly prototype using a truly general purpose tool, and explore - and then iterate on the design - producing a series of descent-with-modification Darwinian evolutions from the original system. I was confident that we would think of things to do with such a platform that we didn't have any inkling of initially - and that if we gave this platform to other groups inside and outside of AT&T Labs, many intents would flower - and that a community would emerge around shared code and systems.

So I proposed that we build a computer system that was the most powerful possible using commercially available parts - no research involved, just system building - and that it should be small enough to be wearable - because you can always pretend that something small enough to be wearable is big (you can just put it on the desk), but you can't pretend something big is wearable.

The next step was to figure out exactly what capabilities to build - what sorts of kitchen sinks to put in. I asked the entire Menlo Studio group to spend two days in a classic brainstorming session to think about this. Challenged to think of user experiences and affordances that they personally would like to have from computers and communication, the twenty-odd engineers and researchers came up with a list of over 300 user experiences that they couldn't currently have but wanted. There were no constraints on the brainstorming; it was pure need/want/desire, with no consideration of what was technically or economically reasonable.

Following this, Bob Alkire and I spent two weeks doing a structural analysis of the brainstorming's results. First we clustered the user experiences, and derived axes along which they could be considered excursions - e.g., fixed vs. mobile, solitary vs. social, realtime vs. stored-and-recalled. We came up with eleven such axes.

The next step was to describe real technology that could support as wide a volume of excursion as possible across all of the axes. At this point user experiences started to drop out - because some of them required magic, or unobtanium of various sorts. The sorts of technological underpinnings we listed included things like networking, both fixed and wireless, 3D graphics, CD-quality audio, spatialized audio, serial ports, telephony, IO for standard devices like keyboards, and for exotic sensors... accelerometers, and more.

At this point the detailed fun started - picking chips. We looked at the fastest available parts that were low-powered enough to be considered for a compact wearable computer: a RISC chip from Alchemy and the Intel XScale.

We decided that we wanted to make an Open Source system. The strategic reason for this was to create a large community of practice around our design.

We picked the Intel XScale CPU and a twin-CPU AMD DSP. We looked at RAM and Flash, and picked the biggest commercially available parts. We constrained the design's footprint to that of a Palm in X and Y, but allowed stacking of daughter cards.

Intel's bridge chip for the XScale didn't work, but we heard that Intel was working with ADI Engineering on an FPGA replacement. We got in on that development, and got the alpha code from them, and Bob Alkire started playing with it. We hired Set Engineering to do detailed board layout for us. Sandra Carrico worked with us to manage the project.

By this time - late summer 2001 - we knew AT&T was in trouble. We wanted to look good, and not get hit at the end of the year - I predicted lay-offs in December - so we optimized everything for time, not money, and PUSHED.

We worked to find customers for our work. We found groups within AT&T that wanted to port speech recognition and speech generation software to "device world" (e.g., PDAs). We also found receptive audiences in academia, a major Japanese electronics manufacturer, and in AT&T Wireless.

Each of us on the Kava team picked a couple of prototyping projects we would do once Kava arrived. I signed up to do two - implementing the Fair Witness video escrow functionality, and a spatialized audio email UI that would utilize the cocktail party effect and transform email and voice mail from linear traversals into 2D maps. I also sketched out the design for a third prototype, one that would depend on E911 or GPS tracking plus physiological monitoring, using our extra IO ports - a personal trainer AI that tracked joggers across terrain and advised them on their pacing, using speech recognition and synthesis, and tailored cardio training combined with look-aheads on stored maps.
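
To make the spatialized email idea a bit more concrete, here is a minimal sketch - hypothetical illustration only, not code we actually wrote for Kava - that maps a linear mailbox onto a 2D sound field around the listener and derives crude constant-power stereo gains for each message. A real implementation would have rendered HRTF-filtered audio on the DSP; this just shows the mapping from a linear traversal to a 2D map.

    import math
    from dataclasses import dataclass

    @dataclass
    class Message:
        sender: str
        subject: str

    def position_for(index, total, radius=1.0):
        """Spread messages along a frontal arc from -90 to +90 degrees."""
        angle = math.pi * (index / max(total - 1, 1)) - math.pi / 2
        return radius * math.sin(angle), radius * math.cos(angle)

    def stereo_gains(x, y):
        """Constant-power pan from the message's azimuth; distance attenuates both channels."""
        azimuth = math.atan2(x, y)               # 0 radians = straight ahead
        pan = (azimuth + math.pi / 2) / math.pi  # 0.0 = hard left, 1.0 = hard right
        attenuation = 1.0 / (1.0 + math.hypot(x, y))
        left = math.cos(pan * math.pi / 2) * attenuation
        right = math.sin(pan * math.pi / 2) * attenuation
        return left, right

    if __name__ == "__main__":
        # Illustrative inbox; senders and subjects are made up.
        inbox = [Message("alice", "status"), Message("bob", "lunch?"), Message("carol", "build broke")]
        for i, msg in enumerate(inbox):
            x, y = position_for(i, len(inbox))
            left, right = stereo_gains(x, y)
            print(f"{msg.sender:6s} at ({x:+.2f}, {y:+.2f})  gains L={left:.2f} R={right:.2f}")

With each sender anchored at a stable position, several messages can play at once and still be picked apart by ear - the cocktail party effect the prototype was meant to exploit.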

December 2001 arrived. Set Engineering got our first boards up and running - and we got the news that the entire Menlo Studio lab was being laid off. No one back east even paid attention to our success... it was too small to be noticed amid the hemorrhage of money that was AT&T.

Kava became shelfware.

Shortly after, the three of us (Bob Alkire, Sandra Carrico, and myself) asked for the rights to the design, since it was being thrown away... and the result back east was "Huh? You mean it's worth something?"... and they asked us for a half million dollars for the design. We offered them a "piece of the action", if they let us have it... and paralysis resulted.

Later, in 2004, Bob Alkire and I found ourselves at Sun Labs - where we reworked some of the ideas from Kava into the design of the Sun SPOT wireless sensor network system.