Wednesday, October 07, 2009

Any more touchy, and these mice could turn into French chefs!

It's no secret that Microsoft hires the best tech researchers out there by the truckload, and quite possibly has a research lab with an intellectual and creative heft that could compare with the legends of yore -- IBM's Watson, AT&T's Bell Labs, and Xerox's Palo Alto Research Center. Of course, it still continues to dish out stinkware such as Windows Vista, but that's despite the efforts of its brain trust, much of whose work likely never sees the light of day. Perhaps in response to Apple's recent leak of a touch-sensitive mouse, Microsoft has released a video of five different types of touch-oriented mice. The demo is really, really neat. I'd love to own one or more of those toys, just to play around.

In the grand scheme of things, mouse-like tools really belong to a paradigm that's showing its age; after all, the great St. Doug Engelbart (may his tribe increase) invented the first mouse in the 1960s. Maybe this is the mouse's dying gasp, its last hurrah. If so, what a wonderful hurrah it is. I hope Microsoft sees fit to release some of their touchy rodents into the wild; who knows what genetic transformations could emerge from such an experiment?


Sunday, July 05, 2009

When Disney Met Dali

From the text at Monsters And Rockets:

In 1946 legendary surrealist Salvador Dali formed an unlikely friendship with Walt Disney, and they spent some time collaborating on a short film called Destino. Dali and Disney artist John Hench worked on a lot of storyboards, but only 18 seconds of test footage were shot before the project was abandoned.

In 1999, Disney's nephew Roy Edward Disney was working on Fantasia 2000 and he decided to complete the Destino project, over 50 years after production began. 25 Disney artists worked from the original storyboards (with some input from Hench himself, and notes from the journals of Dali's widow) and finally completed Destino using a mix of hand-drawn and computer animation. The 18 seconds of test footage were included, in the shots of the two weird, turtle-like creatures seen above.

Destino didn't end up as part of Fantasia 2000 and hasn't been widely screened. It was seen in theaters with the films Calendar Girls and Triplets of Belleville, but so far it hasn't been released on DVD. It's amazing to look at, but I have the feeling that the imagery in Dali's own version would have been a bit more disturbing. (Notice how those turtle monsters kind of stand out from everything around them?) It's also a little funny how the Disney artists just can't resist making the dancing girl into a Disney princess. There are a few shots in here that look a bit like Belle in Dali Land.

Who's on foist?

A timeless classic that entered the popular lexicon ...

Saturday, July 04, 2009

The State is NOT The Community

Over and over again, I see writers -- who ought to know better -- being unable to distinguish between Communism and Community. Communism is where the State arrogates all power to itself and, in fact, does everything to destroy community. The Communist State would like to make every citizen completely dependent on it for everything. The State decides what your duties and responsibilities are and also provides for all your needs. In theory. Community, on the other hand, is a phenomenon that emerges through natural processes from the grassroots, when a group of people find themselves facing a common fate. Community is not imposed from on high, unlike a Communist State.

Open Source and Social Media represent and encourage community-oriented phenomena, not Communism, as their detractors often allege. Both the Totalitarian State and the Powerful Corporation are inimical to the welfare of the Community. Communities should strive to make the State and the Corporation subserve, rather than rule over them.

THIS, is the REAL first iPhone 3GS music video

... or so they say ...

Thursday, July 02, 2009

The thrill is gone, baby ...

Is it over, already? Man, I don't believe it ... maybe I'm just burned out a little, need to take a break from Twitter. Yeah, there've been a few good links to look up, and some nice, pithy sayings that have been keepers. I've linked up with some interesting people, and had a few Twitterversations. Still, I feel a bit jaded. Not sure why.

Though I'm 'following' nearly 190 persons, it feels as though most of the posts come from a handful; almost certainly, not all 190 are posting, and even among those who are, it feels (that's my impression) as though no more than about 20 are doing so frequently.

I guess it feels like I'm meeting the same ol' folks saying the same ol' kinda things that I expect them to say, given that I'm beginning to discern a pattern for each. It's interesting that one unconsciously begins to construct a certain unique persona for each individual, as a gestalt of all their tweets. The tone, the content, the syntactic structure, the tweeting frequency, and the pattern of tweeting (quick bursts of many tweets, periodic tweeting at fairly regular intervals, or random intervals) as well as the avatar image used, all contribute to the complexion of the persona that emerges from the tweetstream.

Guess I need to follow some more people, although I wonder if I can handle that.

Twitter is a cocktail party. Or maybe it is like one of the long train journeys we used to have when I was a child, which often took up to three days to complete. Along the way, we would befriend fellow passengers, have interesting conversations, share food, assist one another (especially in keeping an eye on kids and belongings) and in general have a great time. And then we'd get off the train and never see or hear from them again. Kinda sad, but one's life was enriched anyway, and it made for a memorable journey, making one eager to embark on another one ere too long.

Yes, Twitter feels a bit like that. You overhear scattered fragments of conversations among strangers and a fair bit of it is fun. There is bustle and noise, and a sense of movement. People stream in and out. Passengers get on and off. There are moments of quietude, and then bursts of activity and sound. Some stuff is funny, other stuff is dreary, a bit of it is boring, and then there is some pretty interesting stuff.

You'll likely never come to know people completely, just the side of them they choose to reveal during the journey.

The thing is, the train keeps going on forever, even if you have to get off at some place. And you know that the train didn't start at the place you got on; it's come from a long way off, been running for a long time, and its final destination is way beyond where you will disembark. There are a lot of people on board, and you will never get to meet them all; and some you'll never want to meet.

Maybe I need to visit some other compartments (carriages) and wander around in previously unexplored parts of the train.

Yeah, that's what I'll do. Next time. Maybe. My brain's tired from all the listening and tweeting and trying to make sense of it all. Good night.

Disintermediating Media (or Media for Disintermediation)

Likely the term used most often in an unintentionally ironic manner is the word disintermediation. And in the past year or so, it has become among the top buzzwords in use, flung around freely from every pulpit everywhere, typically delivered in a booming, authoritative tone. Disintermediation is what the New Media or Social Media is all about, we are told. Disintermediation takes away the media so that there is nothing -- nothing -- that remains between you and the thing with which you choose to interact. Take this one, for instance:

Social media is a de-institutionalising and disintermediating force. It gets rid of institutionalised functions. This is the lesson from every sector it has touched. In music it has got rid of the music business (and the creation and sharing of music has flourished). In news it is getting rid of the news business (and the creation and sharing of information is flourishing). In government, logically therefore, it will get rid of the government business.
Ye-ep! Social media disintermediates! You read that right! Now how in heck does it manage that? Perhaps in the manner a dog chases its tail or a snake swallows itself?
Web-ons of Mass Disintermediation: Once the world is completely disintermediated, then we shall all be free! Liberte! Egalite! Fraternite! Revolucion! Che! Etc.!
Hey, it happened in Iran, didn't it?

A search on teh google for the term disintermediation came up with 231,000 hits [June 2, 2009; 8:09 am GMT]. Bing, on the other hand, delivered 208,000. The difference of 23,000 disintermediations seems to have gotten intermediated somewhere in the vast search space separating the two search engines. Interestingly, the Wikipedia definition came up first on both lists -- I'm figuring that Wikipedia is the most popular medium of disintermediation out there. [Disintermediate produced 55,000 hits on Google and 18,400 on Bing; disintermediating gave 61,300 and 12,400 on Google and Bing respectively. Wikipedia continued to rule.]

Google! Hey, there's a disintermediary (17,100 Google hits, but only 112 on Bing! Bing! and all the intermediaries crumble to dust), if there ever was one! Google eliminates everything that stands between you and the information you seek, right? Right?

A discussion of the larger problem with definitions and pronouncements is in order especially because of the currently raging controversy over the book, Free by Wired editor Chris Anderson, sparked off by a none-too-positive New Yorker review by superstar author Malcolm Gladwell and a plagiarism allegation. This controversy is being discussed all over the blogosphere (and is being disintermediated by Wikipedia even as we speak; 286,000 hits for the search string "chris anderson malcolm gladwell free").

The issue is what Anil Dash colorfully calls airport books: these are easy-reading fare, usually written intelligently and engagingly by brand-name authors, dealing with topics of broad current interest but written in a manner appropriate for cocktail party conversations. The material presented usually makes reference to scientific research, and pithy, easily remembered (and quotable) conclusions are presented as if with authority; very often the language and phrases from the book quickly enter the general idiom and become part of folklore, accepted without the need for proof. Nevertheless, the pronouncements made are often quite shallow and don't stand up to intense scrutiny. Cross-questioning by intellects of such stature as my hero Richard Feynman would likely cause the theories and pronouncements to crumble to dust instantaneously. Indeed, the writers often take advantage of the fact that public memory is short, and predictions made in the book are (to the relief of the authors) quickly forgotten, to make way for new ideas and new books (of limited shelf life) that come streaming down the airport aisles. The books, in fact, could be considered at best to be medium- to high-brow entertainment, meant to tickle the mind and provoke thought and discussion. Unfortunately, especially due to the authors' reputations (and the extensive references in the books), they are often treated as having the force of real, formal scholarship, and in that sense serve only to muddy public discourse.

Back to the subject, though: I blame popular writers of airport books and similar blogs for having created this very flaky buzz around the idea of disintermediation. Apparently, New and Social Media eliminate the much reviled Middle Man thereby reducing costs, increasing transaction speed, etc. But is this really disintermediation? Let's see,
mediation: coming in the middle
media: something that comes in the middle
How in heaven can media disintermediate? The reality is that there will always be a medium of some sort whenever there are two or more parties involved in a transaction of any kind (physical or informational). Unless some means of effecting transactions is invented that instantaneously generates the required knowledge and information as well as goods and services inside a recipient's brain and body respectively -- I don't see that happening anytime soon. So,
Please repeat after me: Media does not, and cannot disintermediate.
What social media are achieving is to replace one kind of medium that has outlived its purpose and now only creates inefficiencies with another sort of medium that seeks to eliminate the extant inefficiencies. It is not at all likely that the New Media will remain the Gold Standard forever -- there will come a time when they too will begin to show their age and will need to be supplanted by entirely new media. The danger of falling in love with the term disintermediation is that the label will become permanently associated with New and Social Media, making their introduction and use unchallengeable. They will become the New Holy Cows, and anybody challenging their use, even when the media become ineffective or inefficient, will risk inviting public scorn, and quite possibly be denied the space to publicly present their views.

So may I propose a new, more appropriate and relatively neutral term that correctly captures what's going on, i.e., the replacement of one set of media with another set of potentially more effective and efficient media? Well, ladies and gentlemen, boys and girls, here it is:
But ... but ... but ... you say, that can't be, this ain't a remedy, that term's already taken, and it means something else. How about,
Don't like that, eh? Disintermediation rolls off the tongue, with so much gravitas and authority, that one is loath to give it up. Consider, then,
the bene- prefix meaning "good" in Latin. Or even,
Hey, I think I've got it!
Anyway, dear reader, think up some of your own. But for heaven's sake, please reconsider your indiscriminate employment of the word disintermediation for purposes that have everything to do with mediation.

Thank you!

Wednesday, July 01, 2009

First iPhone music video shot on iPhone?

The iPhone is turning out to be a stealth device ...

One week in Cloud Twitterland

So I figured I needed to know what this Twitter thing was all about. Yeah, I had a Twitter account (@murliman) already, got one a long while ago, needed to feel I was cool and all; but after sending out a tweet or two, I couldn't really figure out what this was all about and abandoned further efforts. Here's my first ever tweet:

exploring twitter
from web
Note the date on that. My second tweet is identical, sent at the same time. Must have goofed up, I guess. My third tweet came more than 7 months later:
Wondering why the Twitter logo(?) is what it is (o_O) Are those the eyes of a kinda stoned birdie?
from web
And that was it.

So my Twitterscape lay fallow, unhonored, unsung, unploughed. Turned out that I was something like the first guy with a telephone or email: nice, but it all seemed pretty pointless. Things turned around in early June this year. I had started blogging again - like mad - the blame for which falls entirely at the feet of my former student, Ashish Sharma (Twitter handle: @ashinertia). I figured I needed to get word out about my blogposts and reasoned I could use Twitter for that; after all there were probably a billion Twitterers out there. At the time, I had a small (single-digit) number of 'followers' and at least they could come to know about my blogs thus. My tweeting resumed with this tweet:
from web
I still didn't know what Twitter was all about, but figured I'd just jump in and find out. In this quest I was aided by @marcynewman who tweeted me, saying, 'Hey, you're tweeting!' or something like that. To which I responded:
@marcynewman :-) just experimenting; was getting frustrated that I didn't know how to use this dang thing. You UberTechi, you!
from web in reply to marcynewman
Now @marcynewman is a major techie even though she claims I'm the one who made her technical and all. She is the quintessential technobabe which term I first heard used by my former colleague Robert "I'm not Bob" Minch when referring to someone we mutually knew. @marcynewman's body is covered with technology; she could walk onto a sci-fi movie set without any additional props or gear and fit right in.

For the next one week I posted 17 tweets ('updates' in Twitterish), 7 of which were a conversation between myself and @marcynewman (which could have happened through chat or direct messaging; still trying to figure out Twitter) and the remaining contained links to blogposts I had created; yes, I was blogging like crazy, compared to the previous years.

Then it all died out as abruptly as it had started.

I'm told this is a standard pattern among Twitterers; the Twitterscape is littered with millions of abandoned @names. But then another thing happened; my son was done with his examinations, and I was free. After that my tweeting began in right earnest all over again, about ten days ago, on June 22, since when I've sent out over 120 tweets; also, quite happily, I am now a certified twunkie (tip o' the hat to @durrink for that term) -- a Twitter Junkie. I think I am beginning to get it and I'll try to present Twitter according to @murliman (tat@m?).

Understanding Twitter

First of all, I think I understand the reason why so many abandon Twitter: one's level of participation in Twitterland is not a continuum of values -- either you're pretty close to being a junkie or you don't go there at all. It's pretty pointless to go there occasionally unless:
  • you're a celebrity and your fans out there are dying for a few morsels now and then
  • you like to be stimulated by random, serendipitous posts from friends or strangers
So, the occasional visitor is either mostly a Tweet generator or a Tweet consumer. Tweeting occasionally by anyone else is of interest primarily to close friends and family, and that too only if said close friends and family are Twunkies.
One might visit Twitter constantly for a limited period of time if
  • there's some major event happening, e.g., the Iran Revolution, the Mumbai Terror Attack, a conference or a ball game, and you'd like to know what's going on (or if you're in the thick of things, even send out tweets yourself)
Those who sign up for a Twitter account to find out what it's all about need to stick it out long enough to become Twunkies (if they are susceptible to the Twug - Twitter-Drug), else they most likely will abandon efforts. My first couple of forays were very superficial; it was only the third time around that I decided to stay the course until something magical happened. And in fact, it did. Like most addictions, I'm not entirely sure why I Tweet, but it does provide some sort of a high; and if I stay away long enough, I begin experiencing some sort of Twithdrawal -- and need to go get my fix again. One article sums it up pretty well:
The Twitter Cycle: Curiosity, abandonment, addiction. Global visitors hit 37 million.
In just two years, Twitter has come a long way. And in the coming years, the social media landscape will transform wildly because of a seed called Twitter.

In some ways, Twitter was a clean sheet restart of electronic communications. Email, blogs, and Facebook had been around for a while, but they had become much too baroque, too overladen with features. They had become bulky, unwieldy, complex. Twitter was a way to return to the drawing board and start from scratch all over again.

The SMS framework in mobile phone communications seemed like a good starting point. There was email, which required computers (or smart phones) and there was SMS (for mobile phone communications). Twitter's founders thought -- how about seamlessly combining mobile phone and computer communications by employing the lowest common denominator -- SMS -- as the messaging structure?

Think about it: It takes two to SMS -- or Tweet: a sender and a receiver. Let's assume these are persons known to each other. Perhaps the tweets are perishable, standalone, and have no further value. There is no need to save or organize the tweets. On the other hand, perhaps the tweeting constitutes a conversation, and there is value in preserving them, much like email messages. If so, the only structuring mechanism needed to manage the tweets is to organize and list them in chronological order. It might also be useful to list each Twitterer's messages separately (and in chronological order). So there are three lists: the complete list of Tweets in chronological order, and two lists containing only each Twitterer's Tweets (the latter two, of course, do not need to be separately maintained but can be generated dynamically from the first on demand).
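That three-list structure can be sketched in a few lines of Python (the tuples and names here are invented for illustration; Twitter's actual storage is, of course, far more elaborate):

```python
from datetime import datetime

# A tweet is just (timestamp, sender, text); the full stream is one
# chronologically ordered master list.
tweets = [
    (datetime(2009, 6, 22, 9, 0), "alice", "exploring twitter"),
    (datetime(2009, 6, 22, 9, 5), "bob", "@alice hey, you're tweeting!"),
    (datetime(2009, 6, 22, 9, 7), "alice", "@bob just experimenting"),
]
tweets.sort(key=lambda t: t[0])  # keep the master list oldest-first

def user_stream(tweets, user):
    """Derive one Twitterer's stream on demand from the master list."""
    return [t for t in tweets if t[1] == user]

print([text for _, _, text in user_stream(tweets, "alice")])
```

The point of the sketch is that the per-Twitterer lists are views, not copies: only the master list needs to be stored.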

Now, friends and family come to learn about this new messaging medium, and want in on the action. The total population of Twitterers grows, say to about 5. These five persons all know each other, and would like to keep up with each other's tweets. It's personal, or work-related, but it keeps them connected. The same structuring mechanism (chronological, and by Twitterer) suffices.

Then others begin to join the Twitterscape -- friends, and later, friends of friends, friends of friends of friends ... ad nauseam. Pretty soon, the number of Twitterers is in the hundreds - or thousands. Everybody no longer knows everybody else. Most have interest in the tweets of only a subset of the Twitternation. And so the idea of a Follower is introduced: each Twitterer chooses to follow some subset of Twitterers in the Twitternation. This select subset (unique to each Twitterer) is made up of individuals known as Friends (not to be confused with Real friends in the Real world).

So where there once was a single, completely-connected cluster of Tweeters, there now are tens of millions of clusters, which in turn are connected to other clusters. Each cluster represents one individual Twitterer and his/her Followers. That Twitterer, in turn, is a Follower (and hence a member) of many other clusters.

What, if any, change is needed to structure and organize Tweets (at least as viewed by a Twitterer)? No longer is it feasible or useful for any Twitterer to view the entire tsunami of Tweets generated by the entire Twitternation of tens (or hundreds) of millions: you view the Tweets of only those you follow. Following, then, is the structuring mechanism for bringing an individual's Tweetupdates down to a manageable number (besides reducing the bandwidth consumption on the Internet by several orders of magnitude).
To manage communications with some level of sanity, each Twitterer chooses to Follow only a relatively small number of Twitterers (although some Twitterers seemingly follow tens of thousands of other Twitterers -- to what end, I don't know; it's unlikely that they actually follow them all, or that all of those tens of thousands tweet regularly).
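Following-as-filter is easy to sketch as well (the handles and tweets below are made up for illustration; this is a toy model, not Twitter's implementation):

```python
# Each Twitterer follows a small subset of the Twitternation.
following = {
    "murliman": {"marcynewman", "durrink"},
    "marcynewman": {"murliman"},
}

# (sequence number, sender, text) triples in chronological order
tweets = [
    (1, "marcynewman", "hey, you're tweeting!"),
    (2, "stranger", "noise from the wider Twitternation"),
    (3, "durrink", "coining the word 'twunkie'"),
]

def timeline(tweets, me):
    """A Twitterer's view: only the tweets of those they follow."""
    return [t for t in tweets if t[1] in following.get(me, set())]

print([sender for _, sender, _ in timeline(tweets, "murliman")])
```

The global stream never shrinks; each Twitterer simply sees a filtered slice of it, which is what keeps the firehose manageable.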

What other structuring mechanisms are available besides View by Individual Twitterer's Tweetstream and View by a Friend's Tweetstream?
Enter: Hashtags. Hashtags are an innovation from Twitterers themselves rather than from Twitter's developers. Hashtags are of the form #searchterm, where 'searchterm' is the string one is searching for. Twitterers developed this as a means of tagging and searching for all Tweets relating to a particular issue, e.g.,
#iranelection or #pdf09
where pdf09 is the name of a recent conference. All tweets that include a specific hashtag are listed in reverse chronological order. While it is possible to search for any character string in Twitter, using a '#' prefix implies that the Twitterer deliberately intended for it to show up in a search. Typically, there is a consensus to employ a specific character string since the Twitter system does not create separate forums to deal with specific issues or events.
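The mechanics of hashtag search amount to a string match over the stream, listed newest first. A minimal sketch (sample tweets invented; real Twitter search is indexed, not a linear scan):

```python
import re

tweets = [  # oldest first
    "polls close at 6pm #iranelection",
    "great keynote this morning #pdf09",
    "lunch break",
    "crowds gathering downtown #iranelection",
]

def by_hashtag(tweets, tag):
    """All tweets carrying a given hashtag, newest first."""
    pattern = re.compile(r"#" + re.escape(tag) + r"\b", re.IGNORECASE)
    return [t for t in reversed(tweets) if pattern.search(t)]

print(by_hashtag(tweets, "iranelection"))
```

The '#' prefix does nothing special technically; it is a convention signalling that the author intended the term to be found.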

That last sentence is important: other social media deliberately create walls and fences for discussions to proceed and socialization to occur in finite forums with defined memberships. Twitter, on the other hand, erects no walls and intentionally permits serendipitous discoveries. Being a Twitterer is akin to wandering into a vast mall or commons, bumping into friends and strangers, chatting with some, overhearing conversations, making general pronouncements that some might hear and most might ignore. It's like a never-ending cocktail party on a monstrous scale. There are some who successfully bring their real-world celebrity status or recognition into Twitter. And there are others whose real-world status may be markedly different (better or worse) from the one enjoyed (or deplored) by their TwitterPersona. Take the case of this musician who earned $19,000 in just ten hours from her Tweets. Certainly better than anything she had managed in real life until then.

I, for one, am enjoying the ride so far. It tickles me to read Tweets coming in from persons of the stature of Arnold Schwarzenegger, Nandan Nilekani, Al Gore and for gosh's sake, Jack Welch! I felt Twitter had arrived the moment I found a post from Welch.

The question now is, what other ways might there be to structure the basic Twitter stream? The need for alternative structuring mechanisms is already evident in the large number of Twitter applications that have emerged during the very short life of Twitter, and more are in the pipeline. Twitter is like a vast, constantly changing terrain, and Twitterers soon learn that they need to make sense of this relentlessly transmogrifying space; they need maps of some kind, and the various Twitter apps help both in making sense of the Twitterscape and in negotiating it successfully. Within a week after diving into the TwitterCloud, I felt disoriented enough to feel the need to scan the web and download tools for managing the process. I am now using two fine, highly-recommended Twitter desktop apps, Tweetdeck and Seesmic Desktop, both constructed using Adobe Air, the Rich Internet App (RIA) development platform.

I can see how the TwitterStream can be the basic building block of just about any communication-based application, mostly involving humans, but not necessarily so: I can conceive of embedded digital devices 'friending' and 'following' other Twittering digital devices, or even humans exchanging Tweets with machines (the machines, of course, parse the Tweets and take appropriate actions). Hence Twitter can become a universal communication infrastructure at a level just above machine communication but low down on the hierarchy of human communication.

It helps Twitter's case that it is addictive: one blogger has found the need to publicly lament that his Twittering has left him little time to blog and that he was going to have to tear himself away from Twittering for the purpose.
Like the world depicted in the movie The Matrix, Twitter is an alternative reality, kaleidoscopic, rich, stimulating, heady, fast-paced, diverse, ever-changing, an infinite series of windows both into the real world as well as the world of the Web, which is itself, in turn, a series of windows ... It's not hard to understand how one can get sucked into this maelstrom with a much greater force than that exerted by the Web itself. There is no starting point nor is there a time and place to get off; one has to force oneself off and return to the physical world. Twitter is a world with strong connections to the physical world, but it is its own space, has its own complexion and character, and feels no less real than the one in which we eat and drink and sleep. Twitter has some of the characteristics of Massively Multiplayer Online Role Playing Games (MMORPGs), but unlike the latter, it has few or no barriers to entry.

One broader observation: I have been associated with what is now called social media but used to be known by various names such as: groupware, computer support for cooperative work, group support systems, etc., for over two decades, and have contributed to formal academic research in the field. In all these years, however, there is not one social application developed by researchers in universities and corporate labs that has found widespread acceptance among the general public. The most wildly popular social applications such as Facebook, Twitter, and Blogger were developed by young, imaginative, energetic persons with no pretensions to doing research: they built tools that they found useful to themselves, and happily, tens of millions of others found them useful too. None of these social apps came out fully developed or with any coherent theoretical model, as academicians insist on creating before they build and explore tools and applications. The apps came out completely from the unique, idiosyncratic experiences of a few individuals; surprisingly, they also matched the needs of the many. Over time, with feedback from users, the tools rapidly evolved. In all these instances, theory appears to follow, rather than lead, phenomena. There must be a whole lot of researchers trying to figure out why Facebook and Twitter have become the monsters they are now, but none of them could have anticipated them based on any available theoretical framework. Clearly, the current social and social-psychological theories are flawed or limited, and need to be reviewed and revised.

But the situation also calls into question the value of academic research in the field designed to produce new social applications (rather than investigate the impacts of extant social applications). It is humbling to realize that there is little to show for over 25 years of formal university and corporate lab research and development in the design and introduction of social media.

I ought to emphasize that the development of core technologies such as operating systems demand the knowledge, skills, and experience of outstanding researchers with excellent credentials; shooting from the hip and designing by the seat of the pants doesn't take you very far while trying to design operating systems, communication protocols, microprocessors and so on; the most influential operating systems: IBM's OS360, Bell Labs' Unix, Digital Research's CP/M, and Xerox's Alto and Star were all written by or under the supervision of Ph.Ds. The same restriction doesn't seem to apply to applications that are built on those solid foundations. The last 15 years have shown that the best innovations in application-oriented technologies happen when useful technology building tools are widely distributed among the general population, whether or not they are technically qualified. There is an incredible amount of ingenuity out there that goes far beyond what might obtain within the walls of formally designated research institutions. Society is greatly benefited by seeding it with an array of Tools for Innovation: give 'em the Tools to Create and then leave them alone.

UPDATE (July 2, 2009): Twittering considered harmful. Ranging from musculoskeletal problems caused by repetitive motion to the more sinister Going-About-One's-Life-While-Distracted-By-Twittering resulting in accidents and serious injury. That, and psychological issues relating to addiction and withdrawal from the physical world.

Tuesday, June 30, 2009

Prototyping books

Books are encapsulated (and structured) written expression. Just as a formal speech is encapsulated spoken expression, a musical performance is encapsulated musical expression, and so on. All forms of encapsulated expression involve the generation of ideas, experimentation with various forms of expression, arrangement of pieces of expression into some linear or spatial order, reorganization of elements as the encapsulation takes shape, and so on. We recognize all these as various steps in a process of prototyping. Prototyping is needed because all encapsulated expression (except for extempore or improvisation) is intended to assume a definite final form which is then frozen for future performances. Unlike theater, recorded encapsulated expression such as books and music doesn't involve the original artists themselves.

Tools for book prototyping have been around for hundreds of years, but only in the past few decades have they begun transforming radically. The earliest tools were palm leaf, parchment, tree bark, or some such surface, and some kind of stick or quill along with ink. Typically, the 'artists' themselves wrote out the work, which was then made available to readers. Later, the role of 'copying writers' emerged, whose task was merely to make exact copies of the original by hand. Gutenberg's press changed that process: the 'copying writer' was replaced by a 'typesetter' who laid out the type, after which any number of copies could be made. The artist/writer created the original manuscript on paper, the typesetter employed the manuscript to set type, and the printer made as many copies as required.

The advent of the desktop computer and the laser printer made it possible for anybody to be the writer, the typesetter, and the printer, all rolled into one.

Today book publishing has become a huge business, but it has also created some dilemmas. Book publishing works on the principle of economies of scale. Publishers want to ensure that there will be sufficient demand for a work before proceeding to publish it. Consequently, publishing is a guessing game: many potentially popular works are rejected, while several duds are published and ignored by the market. Publishing becomes a game of percentages. Books are printed in batches, and fresh batches are printed only if publishers forecast sufficient demand.
Laser printers are fine for printing a few copies to share with friends and family, while book publishers require estimates of many thousands of copies -- hopefully tens of thousands -- before agreeing to publish a book. Some once-popular books go out of print and then become difficult to find.

A solution is needed to fill the middle ground, and printing books on demand may be just the disruptive innovation to do that. Print on demand makes it nearly as economical to print ten copies of a book as one hundred, since the overheads are low. The technology also ensures that (assuming copyright issues are sorted out) no book need ever go out of print. Additional benefits emerge: it will no longer be necessary to warehouse any but the most popular, fast-moving titles; books with low demand will always be available if the customer is willing to pay a little more and wait a little longer. And people looking for rare works usually are.
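The cost structure described above -- a small fixed overhead instead of a large one -- is easy to see with a quick sketch. All the dollar figures here are assumed for illustration, not real industry numbers:

```python
# Per-copy cost under two printing models (all figures are assumed
# for illustration, not real industry numbers).

def per_copy_cost(fixed_setup, marginal_cost, copies):
    """Average cost per copy: setup overhead spread over the run."""
    return fixed_setup / copies + marginal_cost

# Traditional offset printing: high setup, low marginal cost.
offset_10 = per_copy_cost(fixed_setup=2000.0, marginal_cost=1.0, copies=10)
offset_5000 = per_copy_cost(fixed_setup=2000.0, marginal_cost=1.0, copies=5000)

# Print on demand: tiny setup, higher marginal cost.
pod_10 = per_copy_cost(fixed_setup=50.0, marginal_cost=4.0, copies=10)
pod_100 = per_copy_cost(fixed_setup=50.0, marginal_cost=4.0, copies=100)

print(f"Offset, 10 copies:   ${offset_10:,.2f} each")    # $201.00 each
print(f"Offset, 5000 copies: ${offset_5000:,.2f} each")  # $1.40 each
print(f"POD, 10 copies:      ${pod_10:,.2f} each")       # $9.00 each
print(f"POD, 100 copies:     ${pod_100:,.2f} each")      # $4.50 each
```

With a large setup cost, short runs are prohibitively expensive; with print on demand, the per-copy cost barely moves between ten copies and a hundred, which is exactly why the middle ground suddenly becomes viable.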

Writers are now free to actually prototype books: they may write one, print a small number of copies, and, based on demand and feedback, update the book and print many more. Of course, this is already possible with electronic publishing, especially on the web. Project Gutenberg is making many classics that have entered the public domain available on the internet, and e-book readers such as Amazon's Kindle are trying to make the physical book obsolete. But it will be a while before physical books fall out of favor; while the demand for them may shrink, they will continue to exist because they are more durable than electronic devices, don't need batteries, and one never need worry about changes in data storage formats and incompatibilities.

The Espresso Book Machine, discussed in this article in the Boston Globe, is pioneering printing books on demand at the Northshire bookstore.
When the machine is connected to an expanded online catalog of titles later this year, Morrow said, the bookstore will be able to offer customers an “ATM for books’’ that will provide access to millions of works.

“The idea is that soon we’ll be able to print out any book that’s ever been printed,’’ he said. “That could really change people’s image of the small bookstore.’’
In its first year, Northshire’s book machine printed dozens of original books by customers, including memoirs, autobiographies, poetry collections, and cookbooks, usually producing from 30 to 50 copies of each. The bookstore also published a young adult novel written by a local 12-year-old and a previously out-of-print guide to Manchester.

Self-publishers pay a $49 setup fee and a per-page rate that ranges from 5 to 9 cents, depending on the length. Northshire provides an a la carte menu of editorial and design services from a network of providers. Copy editing costs 1 cent per word; book design services, $40 an hour.
Rodefeld, a former graphic designer who works at a tiny desk next to the Espresso machine, produces up to 35 books a day. “It’s exciting to see an author’s face when I hand them the first book off the press,’’ she said. “To see the dream, the fantasy, become a reality - that really tickles me. I get to be Santa Claus all the time here.’’
The numbers at Northshire Bookstore, Morrow said, are “on the cusp’’ of working out. The big payoff will come, he said, when the Espresso machine is seamlessly connected to the entire universe of books, allowing the store to fulfill any request in minutes.
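Taking the Globe's quoted rates at face value ($49 setup, 5 to 9 cents per page, optional copy editing at 1 cent per word), it's straightforward to estimate what a small self-published run would cost. The page count, word count, and run size below are my own assumptions, not from the article:

```python
# Estimated self-publishing cost at the quoted Espresso rates.
# Book size and run size are made-up examples.

SETUP_FEE = 49.00          # one-time fee, per title (quoted)
PER_PAGE_RATE = 0.07       # quoted range is $0.05-$0.09; midpoint assumed
COPYEDIT_PER_WORD = 0.01   # optional service (quoted)

pages = 200                # hypothetical memoir
words = 60_000
copies = 40                # within the 30-50 copies the store says is typical

printing = PER_PAGE_RATE * pages * copies
copyedit = COPYEDIT_PER_WORD * words
total = SETUP_FEE + printing + copyedit

print(f"Printing 40 copies: ${printing:,.2f}")   # $560.00
print(f"Copy editing:       ${copyedit:,.2f}")   # $600.00
print(f"Total:              ${total:,.2f}")      # $1,209.00
print(f"Per copy (no edit): ${(SETUP_FEE + printing) / copies:,.2f}")
```

Around fifteen dollars a copy before editing services -- well above a mass-market paperback, but trivially cheap next to a traditional print run, which is the whole point of the machine.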

MIT Eureka Fest 2009

High school student projects. [Thanks to CrunchGear.]

"You may use your class notes and Feynman"

Great story! Probably apocryphal, but the kind of story I want to believe because I love the man so much.


Posted July 30, 2001 06:15 | Category: Story | #

Since Caltech has an honor system most exams tend to be take home and open book. The instructor for the class will write any special directions at the top of the exam. For a freshman physics exam one year the instructions read:

You have 3 hours.
You may use your class notes and Feynman.

"Feynman" of course referred to the Feynman physics lecture notes, which are published in three volumes.

On reading these instructions one particularly alert student grabbed his exam and raced across campus to Richard Feynman's office. He handed the exam to Feynman explaining that the directions clearly indicated he was a valid resource. Feynman glanced over the instructions and agreed. He wrote out the exam in less than half an hour, and got a perfect score!

Posted to by (John Daly) On 3/8/92

Monday, June 29, 2009

Feynman: The inconceivable nature of Nature

I gotta stop this, else I'll end up posting every dang video of Feynman here. What an incredible teacher! He breathed such life and color and intensity into every little idea about the universe. He must have received a standing ovation at the end of each of his lectures.

Feynman's gotta have his orange juice

What a riot!

Sunday, June 28, 2009

Richard Feynman on science and aesthetics

Richard Feynman is my favorite scientist. Scratch that -- he's my favorite thinker, period. Delivered in his inimitable style and East Coast accent.

Green Box - innovative pizza delivery box

A nod to Daniel Pink, on whose site I found this video. Very, very neat -- especially the leftover box at the end. Just when you thought no more innovation was possible in this most minimal of designs comes a redesign that requires no additional material, nor even a significant change in the manufacturing process -- and yet it eliminates the extra material needed to wrap any leftovers. The leftover box takes up little room in a refrigerator. This is pure genius.

This one's for you, Michael ...

To be honest, I never was one of your fans. And yes, over the last decade or two, whatever fame and adulation you may have garnered was overshadowed -- with considerable support from a rapacious media -- by your eccentricities, of which you had more than your fair share.

But then, I was not among the legions -- hundreds of millions, likely, perhaps even a billion or more, it appears now -- of kids who had grown up listening to your music and who never gave up trying to emulate your style, especially your dancing. Yes, you were a god up in the sky to them, but you also made it possible for them to dream that they could dance like you some day. And man, you knew how to dance; you knew how to do things with your body that other grown-ups couldn't even imagine doing. But not the kids: their minds, their bodies, were pliable, and the beat, and the music, it was so infectious, so contagious, they just had to go out and try doing it themselves. And they were happy just to try, and to feel that they had managed to accomplish at least some part of what you did so well.

You delighted, you entertained, you, at least for a brief moment, uplifted the spirits of millions of people, young and old, not just in your town, not just in the country of your birth, not even just people of your race or color. You cut across all barriers, Michael, and when you sang, We are the world, it was utterly believable, for you had the ability to captivate the hearts of people from every culture and from every social stratum and from every generation.

You were an original, Michael. You took the seemingly ordinary, polished it, perfected it, enhanced it, transformed it, infused it with life, with power, with intensity, with passion, and yet made it all look so easy. You melded music, movement and drama into one seamless, inseparable spectacle. All the kids wanted to be you, be Michael Jackson, the greatest entertainer yet to walk upon the earth. Your influence stretches across the planet, and entertainers in every land owe at least a small debt to you. The film industry in India has constructed an entire genre of song and dance derived directly from your innovations.

Like many original and innovative persons who have graced this planet, your life was short, and tragic. Why is it almost a law of nature that those who give of themselves the most must also suffer the most?

But it's over now, Michael, you will suffer no more. The media will no longer mock, taunt and haunt you. The hacks will return to their sordid lives, but you will remain forever in the hearts of the masses you delighted. And generations from now, they will speak of the man who brought so much joy to life. They will still be doing the moonwalk.

Thank you, Michael, rest in peace. Know that you will be loved forever.

Friday, June 26, 2009

From left field: the unpredictable impacts of innovations. And of deaths.

Every now and then a monumental event occurs, or a seemingly innocuous innovation enters human society, which then dramatically alters its configuration, power structure, processes, communication, and a whole lot else. It's hard to tell, until many years have passed, just how dramatic the impact was, but we live in a time when many such events occur.

The impact of the world wide web has not only been well researched, it has been experienced all over the world. The web caused a very rapid, nearly discontinuous change in the way the peoples of the world generate, exchange, and absorb information. Many predictions about the world made before 1990 are practically worthless -- or change happened much, much earlier than anyone anticipated.

The digitization of music, followed by the infrastructure to stream it over the web without the need for permanent recording media like tapes, vinyl records, or CDs, has rendered the music publishing industry in its current form virtually superfluous. And the same impact is beginning to be felt in the book publishing business -- ironically after, rather than before, the music business, although text and graphics were available on the net long before music was.

And now, the Apple iPhone. According to one report, within a week of the introduction of the iPhone 3GS, which is capable of recording video, there was an incredible 400% surge in YouTube video uploads. Why? The iPhone makes it trivial not only to record, but also to edit and directly upload video from the phone, eliminating the intermediate steps of transferring video to a computer, editing it there, and uploading it.

Again and again we see that one of the most common ways in which innovations transform existing structures and processes is by eliminating intermediaries or intermediate steps. Telephones, email, personal vehicles, personal computers, TV ...

Forecasting is a chancy game in these times; most forecasters end up looking foolish eventually. Who knew how popular Michael Jackson was? His untimely death is almost bringing down the internet, with people communicating their grief, sharing stories and songs, and celebrating his life.

Thursday, June 25, 2009

Innovate like Microsoft (Rip off the other guy)

You'd think success, even middle age, or Total Market Dominance -- or something -- would transform Microsoft. No such luck. The company that ripped off DOS (CP/M), Windows (Mac), Powerpoint (Persuasion), Internet Explorer (Mosaic), NT (VMS), and an almost endless list of technologies now rips off a small travel site called Kayak via its 'new/improved' Bing search engine. Take a look at the picture above and decide for yourself. And read the story too.

On the benefits of being 'scatterbrained'

More than two decades ago, when I was in graduate school pursuing a PhD, I nervously carried a draft proposal of a topic that really excited me to a faculty member whom I looked upon as a potential dissertation advisor. A European with a reasonable command of English, she was reputed to be sharp, cold, curt, and fastidious. She was all that and more. She spent no more than about ten minutes with me (she was extremely organized), and during that span she browsed through my apology for a topic analysis, marked it all over in red ink, and left long, deep gashes in it. With each stroke of her pen my enthusiasm dropped several feet, and by the time I left it must have gone right through the ground and emerged from the other side of the earth. I don't recollect anything she scribbled on the paper other than the following words: "you are scatterbrained".

Man, those words have reverberated in my brain for more than two decades -- they hurt, and badly. Needless to add, I never pursued either the subject or the faculty member any further. She went on to become a highly reputed researcher in the field and is now a member of various important international bodies and a consultant to a number of large corporations. She also divorced her husband at that time. Yeah, snarky, but I had to get that in. And honestly, while she has published enough material to fill a large truck, there is not one thing there that sets one's heart racing. It is dull, boring stuff, bordering on the obvious. Meticulous, methodical, rigorous ... all that stuff. Somebody's got to do it, for the benefit of science, I guess, and she did it. Good for her, and for science. I'm not sure if anybody would ever want to read her workmanlike writing again; certainly not me.

Me, I checked out other faculty in the department, eventually quit, and got my PhD under the most wonderful advisor anybody could hope for, at another university. And I had the time of my life researching what I loved in the manner I wished. I remained (and remain to this day) a scatterbrain, a quality that turned out to be an asset in the field I eventually settled on.

Now, in an article titled "A wandering mind heads straight towards insight," the esteemed Wall Street Journal waxes eloquent on the benefits of being what that august professor deemed 'scatterbrained'. Referring to major breakthroughs such as that of Archimedes, the article says,
These sudden insights, they found, are the culmination of an intense and complex series of brain states that require more neural resources than methodical reasoning. People who solve problems through insight generate different patterns of brain waves than those who solve problems analytically. "Your brain is really working quite hard before this moment of insight," says psychologist Mark Wheeler at the University of Pittsburgh. "There is a lot going on behind the scenes."
So, Professor Methodical had her own way, and I had mine, and never the twain would meet.
In fact, our brain may be most actively engaged when our mind is wandering and we've actually lost track of our thoughts, a new brain-scanning study suggests. "Solving a problem with insight is fundamentally different from solving a problem analytically," Dr. Kounios says. "There really are different brain mechanisms involved."
That was me.
By most measures, we spend about a third of our time daydreaming, yet our brain is unusually active during these seemingly idle moments. Left to its own devices, our brain activates several areas associated with complex problem solving, which researchers had previously assumed were dormant during daydreams. Moreover, it appears to be the only time these areas work in unison.

"People assumed that when your mind wandered it was empty," says cognitive neuroscientist Kalina Christoff at the University of British Columbia in Vancouver, who reported the findings last month in the Proceedings of the National Academy of Sciences. As measured by brain activity, however, "mind wandering is a much more active state than we ever imagined, much more active than during reasoning with a complex problem."
So here's my advice to you, gentle reader. Go out, daydream your heart out. You're allowed to daydream at least one-third of the time, anyway, as per the article. Daydreaming is good for you, and for society. Daydreaming could lead to stunning breakthroughs that could improve mankind's lot. But even if it didn't, someone engaged in daydreaming is not committing crimes, driving dangerously, or causing any kind of harm to the world. Now there's a two-for-one deal: society wins even if nothing comes out of your daydreaming. And you've had a great time too!

I'm thinking of launching a non-profit organization called Society for the Promotion of Universal Daydreaming with a large potato for a corporate logo (and mascot), symbolizing the legion of daydreaming couch potatoes that have made the world a better place. ;-)

Pizza Hut Minus Pizza = The Hut. And Pizza. Hunh?

Apparently, people -- if you can call those over 35 that -- don't want to eat pizza anymore. At least, not the kind of junk served at Pizza Hut. This has the corporation's mandarins worried. They sat around a swank table and exclaimed, 'Holy crap, they don't want to eat junk anymore?! I wonder why?!' So they went and asked them (the 35-and-up geezers). And they said (and I quote):

one of the big things that would reignite their passion with the category is to have a pizza made with multigrain crust and an all natural tomato sauce...
All natural tomato sauce! Now, who'da thunk! After all the decades and billions of dollars we have spent convincing people that synthetic crap is good for them, those ingrates want to eat healthy, natural stuff! Oh, the nerve!

After the dust settled, the Chief Pizza Officer and his Condimental Lackeys decided to serve customers stuff that actually grows in the ground. Just to play it safe, they decided to change the corporation's name from 'Pizza Hut' to just 'The Hut'! Ain't that cool and all?! They did this, because ... wait for this ...
... that ties in nicely with (today's) texting generation.
Oh, yeah, it does! You see texting limits you to 140 characters, and we were able to knock off a whole 2 (two) characters from the name! We are so ingenious! Now, whenever people come across the 'vocabulary word' (which is what we call it) 'hut' they will immediately associate it with pizzas! Indeed, as they travel around, especially in the developing world, they will come across many huts, and just looking at them will immediately generate a craving for pizza, and they'll rush back home immediately for their favorite pie!

And then look at our real game changer ... a red-colored box! Now, that's a first in the pizza business! No more dull, brown boxes, but a brilliant lip-smacking red one, to get the gastric juices flowing.

Crap by any other name remains crap. Taste and smell are the most powerful senses that are hard to influence through the mind. There is innovation, and then there is stupid stuff like this. Pathetic. Some marketing consultants must be laughing all the way to the ATM.

Wednesday, June 24, 2009

The Idea of Sam Pitroda

Twenty-five years ago, before we had Azim Premji and Narayanamurthy to inspire us, before Abdul Kalam fired our imaginations and became a household name, there was Sam Pitroda to provide leadership in advanced technologies to an emergent India. Pitroda grew up in a humble Gujarati family in Orissa and, after moving to the US, made it big as a telecom entrepreneur, eventually becoming a US citizen. He was invited by the then newly anointed Prime Minister Rajiv Gandhi to help navigate India into the 21st century on a technology platform. Pitroda was reputed to have turned in his US citizenship in order to serve as an advisor to Rajiv Gandhi.

In the end, the Idea of Sam Pitroda was more successful and enduring than any initiatives he got going. He served as a beacon of inspiration to a whole generation of Indian engineers, many of whom have gone on to set up their own projects. And indirectly, he likely revolutionized telecommunications in India.

Eventually, Rajiv Gandhi -- a technology champion who was India's first Prime Minister to visibly use a desktop computer in his office -- was voted out of office, and Pitroda lost his key champion in the Indian government. A disappointed Pitroda returned to the US, where he continues to be based, but his desire to help transform India through technology has not lost any of its intensity.

Flipping through banal TV programs, I managed to catch snatches of an interview with him on NDTV Profit. In that part of the interview, Pitroda was asked to address the problem of government money and resources meant for villages rarely reaching them because of the manifold layers of middlemen who took their cut, leaving next to nothing for the intended recipients. He responded that technology could eliminate all the layers of middlemen and ensure that the losses and inefficiencies of transfer were minimal.

I agree with him on this: as various governmental operations get computerized, paper files get eliminated, and with that, the tendency of said files to gather dust, lose documents, or vanish completely -- unless the various public intermediaries are propitiated with monetary benefits. I am pleasantly surprised at some of the dramatic changes of the past 25 years.

But let's address another, more structural problem: why did we end up with so many layers in the first place? It has been remarked that India lives in her villages, and that is true in many ways, including one especially important one: India has long had a decentralized society and culture, made up of autonomous, self-governing villages and cultures rooted in local geography and history. Over the millennia, kingdoms and empires have come and gone, but these powers have been only loosely coupled to the fate of the villages, where life proceeded quite independently of the transient powers that would stop by to collect tribute. The idea of centralization, be it of culture (and religion) or of society, is an alien, Western one. Centralized societies demand a homogeneity of belief and practice that is unsuited to India's diversity. Centralization has its advantages -- it is centralization that permits the creation of large organizations and even empires, such as the British Empire, which eventually came to rule over a significant fraction of the world. But when centralized empires collapse, chaos ensues as new leaders emerge. Autonomous villages are limited in size and power, but if any one village (or the reigning regional empire) collapses, there are few shocks, if any, to the social system as a whole. This is how the culture of India has survived with little change for many centuries. Loosely coupled systems and societies are stable and long-lasting, and permit a degree of diversity that cannot be imagined in large, monolithic societies and systems.

When the British took over India, they imposed their centralized, monolithic organization and processes on a diverse, loosely coupled society. This has never worked well. After Independence, India was bitten by the Socialism bug, which, since it came from the West, was cut from the same cloth; it emphasized centralization and elimination of diversity. The many layers of bureaucracy are the result of having to create a centralized administrative structure for this vast land.

First, let's kick out the virus of socialism; apart from a few ideas that could easily have been derived from humanism, socialism has done far more harm than good. The additional baggage of centralization that came with socialism has done even more harm. There needs to be centralization of law and order enforcement -- we need to have uniform laws and rights for all, in theory as well as practice. And we also need to ensure that Indian citizens are permitted to move and settle freely throughout the land without fear or favor. Beyond this, administration needs to devolve to local units. The long arm of the Central Government needs to be shrunk by several orders of magnitude. We need more autonomy throughout India to reflect a diversity of culture and heritage that has withstood the test of time.

So I agree with Pitroda: we can and should use more technology to reduce corruption, but we also need to dismantle, redesign, and reconstruct the administrative policies and processes of the country.

F100 CEOs don't Tweet (but do they Rock 'n Roll?)

Who would you vote to lead the corporations of the present into the future: suits who network with their peers at a country club, nursing a glass of Scotch, or folks in jeans (or even a suit, maybe?) connecting across the globe through social media (Facebook, Twitter, blogs, wikis, etc.)? Okay, my bias is showing here, but so what? And granted, it takes a whole lot more than being social media savvy to run a company -- today. Jonathan Schwartz, CEO of Sun Microsystems, has had a blog for years, dang it, and yet couldn't do better than sell out to Oracle. Then again, maybe he was so savvy, he sold out, and that was the best thing anyone could have done in the prevailing circumstances.

But consider this: today's corporation needs to keep hiring, especially at the entry level; a level made up mostly of young people, who tend to be social media savvy. These are the people who will eventually end up at the uppermost rungs of the corporation and run it. And smart CEOs will in fact go out of the way to hire young people who get New Media. For it's increasingly clear that the future of business -- and society (and even nations: check out all the Tweets coming out of Iran) -- will rest increasingly on the ability to tap into and take advantage of social media. Anyway, that's what I believe, and I might try to justify this in another post, or just point the reader to people like Clay Shirky and Chris Anderson and Siva Vaidyanathan who've likely already done so (see Kevin Kelly).

So it's interesting to learn from UberCEO that the CEOs of Fortune 100 companies don't get social media. Or maybe they get it, but aren't social media savvy. Or perhaps they're savvy but are waiting for a strategic moment to make their entry. Or are so smart that they know that using it really doesn't serve much of a purpose in their businesses.

Hmm ... I doubt that they are that smart, or else one of the traditional guys -- Barnes and Noble or Borders -- would have started something like Amazon (which is now eating their lunch). From the report:
  • Only two CEOs have Twitter accounts.
  • 13 CEOs have LinkedIn profiles, and of those only three have more than 10 connections.
  • 81% of CEOs don't have a personal Facebook page.
  • Three quarters of the CEOs have some kind of Wikipedia entry, but nearly a third of those have limited or outdated information.
  • Not one Fortune 100 CEO has a blog.
I can understand them not having a Twitter account -- even I'm still flailing about doing my best to get it (ain't givin' up 'til I do). Not that I'm a barometer for this sort of thing, but still. But no blog? Dude, blogs are so 1999, and you still ain't there yet? I'd have expected them to have at least hired a 20-year old to do social media on their behalf. I guess they couldn't find one who got it and was okay with wearing a suit too.
Wikipedia had the highest level of engagement among the Fortune 100 CEOs, yet 28% of those entries had incorrect titles, missing information or lacked sources.
No excuses, babe. Even an old-time marketer will tell you that if you don't actively manage your public image, you'd better accept whatever comes up out there.

Given that F100 CEOs seem to be Old School Tie types, it's rather telling that there are more of them on Facebook (which is swarming with kids) than on LinkedIn, a professional networking service -- so are the CEOs stalking kids on Facebook? Man, that ain't even funny.

Now, Facebook is about networking, but that doesn't necessarily mean it's just a place where you post pictures of last night's drunken revelries. Stanford is now experimenting with holding office hours on Facebook, goshdarn it! First out was Prof. Phil Zimbardo, he of the infamous prison experiment. Not exactly a spring chicken, but certainly an out-of-the-box thinker. So it's not about one's age; it's an attitude thing. The New England Journal of Medicine has a presence on Facebook too. If these two very traditional institutions get it, there's no reason why F100 CEOs shouldn't be out there.

There are CXOs who Twitter, but they're pretty far removed from the F100 bubble. Here's a (pretty long) list. More than likely, they're relatively young and are into tech, or are tech-savvy (e.g., @werner -- Werner Vogels, CTO of Amazon, and @vivek -- Vivek Ranadive, Founder & CEO of TIBCO). And then there are the thought leaders of the new Techno-Business-Cultural Zeitgeist, people like Clay Shirky, Chris Anderson, Tim O'Reilly, Dave Winer, Siva Vaidyanathan, Malcolm Gladwell, Al Gore, and Lawrence Lessig -- people of considerable influence, especially among those under 30. Or even 40. Despite most of them being over 50. They get it.

Here are more detailed data from the study:

Whaddaya think?

Pre:iPhone::Mac:PC - It's like old times all over again

Reports coming in from across the web suggest that Palm's Pre smartphone is an outstanding device, and in many ways superior (slider keyboard, wireless charging, camera flash, background applications, web integration, removable battery, and a slick, gesture-based user interface) to Apple's iPhone. And while Apple's newly introduced iPhone 3GS ups the ante, there are already sufficient reasons for people to stick with the iPhone. Back in the day, Apple's Macintosh, introduced in 1984, attracted a loyal following, but the bulk of the market was owned by the far inferior IBM/Microsoft PC, mainly for two reasons: the lower price, and the availability of many applications.

So while the Pre is a beautiful device that can hold its own against the iPhone, it has thus far managed to garner only a few dozen applications, compared to Apple's 50,000 and counting. That difference is going to sway the prospective smartphone customer towards the iPhone, however much she may like the Pre, and so the iPhone is going to have greater numbers -- and hence greater profits for application developers -- than the Pre.

Apple learned its lesson -- it's retained its design cachet, and it now has the application portfolio: an unbeatable combination. All hail thee, St. Steve!

Price, Aesthetics, Functionality

Back in the paleolithic age of personal computing, Apple was among the first out of the gate in the mid-1970s, and then established a strong presence through its open hardware architecture (mostly a product of co-founder Steve Wozniak's philosophy). Then in 1981, IBM came in like a tsunami and swept the PC market away from Apple. IBM's message was: we're the Big Serious Computer Company, not a bunch of bearded hippies like the other guys. Apple came back with the paradigm-busting Macintosh and quickly built up a fanatical following. Apple was big on aesthetics and usability. The IBM-Microsoft product was a sloppy apology, but it had the right combination of a lower price and a vast ocean of applications that the Mac could only dream of. Price and functionality won over design aesthetics and usability. In 1996, before he returned to Apple, Steve Jobs declared,
If I were running Apple, I would milk the Macintosh for all it's worth -- and get busy on the next great thing. The PC wars are over. Done. Microsoft won a long time ago.
What nobody realized at the time was that Jobs was plotting his revenge -- and that he had learned his lessons well from the Mac-PC wars. Fast forward a decade to 2007, and Apple introduced the iPhone. Well-known industry pundit John Dvorak declared it dead in the water. Aesthetically, the iPhone was the Mac's spiritual successor -- and indeed, in its refinement, it outshone the Mac in every department. There was a feeling going around, however, that it would suffer the same fate as the Mac -- a fine device worthy of being an art exhibit, but overpriced and likely an also-ran in a crowded cellphone market. It didn't escape Jobs' sharp intellect that applications were what had made the PC a rip-roaring success. At the same time, his high personal standards refused to permit him to put out an aesthetically inferior product. As he observed of the products coming out of Apple before his celebrated return in 1997,
"The products suck! There's no sex in them anymore!"
-- On Gil Amelio's lackluster reign, in BusinessWeek, July 1997
Hence, rather than compromise, Apple came out with an elegant product and aggressively promoted a market for iPhone applications -- which now number over 50,000. When the third-generation iPhone 3GS debuted in stores last weekend, over one million units were sold.

There is a lesson here for the marketing of functional digital products. There appear to be three important dimensions to a product or service: Price, Aesthetics, and Functionality. The IBM PC scored low on Aesthetics, but its scores for Price and Functionality were high. It beat the Mac, which scored high on Aesthetics but low on Price and on the range of applications (Functionality). The iPhone scores moderately on Price (neither too high nor too low), but very highly on Aesthetics and Functionality (apps). No wonder, then, that it's beating the pants off the competition; it's like buying a high-end PC and getting a Mac for free. Who wouldn't go for it?
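The three-dimensional comparison can even be sketched as a toy scoring model. The scores and equal weighting below are purely illustrative inventions, not data from any market study; the point is only that a balanced profile across all three dimensions beats excellence on one:

```python
# Toy model of the Price / Aesthetics / Functionality comparison.
# Scores are invented for illustration, on a 1 (poor) to 5 (excellent) scale;
# for Price, a higher score means a *more attractive* (lower) price.
products = {
    "IBM PC": {"price": 4, "aesthetics": 2, "functionality": 5},
    "Mac":    {"price": 2, "aesthetics": 5, "functionality": 2},
    "iPhone": {"price": 3, "aesthetics": 5, "functionality": 5},
}

def overall(scores):
    """Naive equal-weight total across the three dimensions."""
    return scores["price"] + scores["aesthetics"] + scores["functionality"]

# Rank products by their total score, best first.
ranked = sorted(products, key=lambda name: overall(products[name]), reverse=True)
print(ranked)
```

Under this (admittedly crude) equal-weight scheme, the iPhone's balanced profile puts it on top, the IBM PC second, and the Mac last -- mirroring the history sketched above.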

Tuesday, June 23, 2009

Innovation and evolution: winning the battle, losing the war

Life itself is a chancy thing, but like love, the pursuit of innovation is among the most fickle and heartbreaking ways to spend one's time on earth. And just as nobody ought to fall in love for any reason but to love, it is wise never to innovate unless the process of innovation is itself intrinsically appealing and satisfying to the heart. Whatever comes out of the process ought to be treated as a bonus.

Millions of organisms, many beautiful to behold, have become extinct over the eons, and thousands more face extinction every day. They have reached an evolutionary dead-end, not because of any intrinsic shortcoming, but because they no longer satisfy the criterion of fitness with their environments. It is Richard Dawkins' Blind Watchmaker at work, dispassionately removing those organisms that no longer fit into the grand scheme of things, regardless of their intrinsic merit.

The same sort of phenomenon is observed with respect to innovations, except that the ecosystem in which they emerge (or in which they face extinction) is human society and not the natural world. There is a certain tragic quality to this state of affairs: On the one hand, it is in human nature to try to perfect any artifact that emerges out of the imagination, and in the case of complex artifacts such as advanced technologies, such perfection requires intense, repetitive effort, and many iterations and years before any level of perfection is achieved. On the other hand, the ecosystem is an unfeeling context which cares little for human aspirations regarding the perfection of innovations. At any moment, an artifact or technology can be rendered unfit (in an evolutionary sense) because of changes in the environment, especially the emergence of competing technologies; and however wondrous and intricate it may be, further development ceases abruptly, and it is left to be mourned only by its sometimes resentful inventors, but is cruelly forgotten, or even ridiculed, by the masses. Not long after, it begins to appear quaint, archaic, obsolete, something that would never have had a chance to survive, anyway.

Charles Babbage's Difference Engine, a mechanical calculator made up of thousands of gears that he designed in 1822, is one such example. Machines constructed from his designs are beautiful to behold and demand considerable precision to build. But they are far more expensive to construct and far less capable and accurate than any modern, digital, hand-held calculator. The mechanical computers were great innovations, but in retrospect could never have scaled up. A very similar example is that of beautiful, expensive, high-precision mechanical watches and their far less expensive, much more mundane, but nevertheless far more accurate digital successors. Mechanical winding watches have been relegated (or perhaps elevated) to the status of expensive fashion accessories, even jewellery, whose primary function is that of an adornment and an object of wonder rather than a device that tells the time.

Over and over again, Mother Nature seems to have been right, but her correctness is perceivable only in retrospect, and often only after an interminably long period.

I came across two other instances of Cruel But Always Correct Nature: the American Apollo space program and Blu-ray optical discs. I was surprised to see the first described as some sort of failure -- an evolutionary dead-end -- and the latter as already obsolete; but upon reading through the articles, I understand and agree with the sentiments expressed. The US space program was launched in right earnest, thus:
America had been inspired by President Kennedy's wish, announced in 1961, of "achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to Earth." After his assassination in 1963, the idea became a homage to him, a way of showing the world what the United States would have achieved had he lived. Within days of the Apollo 11 astronauts' safe return to Earth, someone put a message on Kennedy's grave: "Mr President, the Eagle has landed." Job done, in other words.
By 1969, the task was done, and the last moon mission, Apollo 17, flew in 1972; the next three missions were cancelled, and that was that. The space program was launched not so much to land a man on the moon as to tell the world that the US could beat the USSR at its own game. And it did -- at immense financial and personal cost:
It was also an extraordinarily expensive project, it should be noted. The entire Apollo programme cost $24bn in 1960s money - around $1 trillion in today's - and for several years was swallowing up almost 5 per cent of the US federal budget. In addition, there was also a considerable emotional cost to the missions, a point stressed by Christopher Riley, co-producer of the 2007 documentary In the Shadow of the Moon. "A great many Americans suffered premature heart attacks and strokes from their efforts in making the Apollo project succeed. More than 400,000 workers were employed by private contractors to build capsules, rocket engines, space suits, and computers for Apollo and the vast majority worked flat out, over weekends and holidays, much of the time for free, for several years to make sure the programme succeeded."

For example, at the Grumman factory in New Jersey, where the lunar module was built, staff would clock off at 5pm, leave by the front door, walk round to the back and work for free until midnight. Similarly, employees at the International Latex Corporation - which made the suits worn by the Apollo astronauts - worked with equally obsessive intensity. In a recent documentary, the company's senior seamstress, Eleanor Foraker, recalled working 80-hour weeks without days off or holidays for three continuous years, suffering two nervous breakdowns in the process. "I would leave the plant at five o'clock in the morning and be back by seven. But it was worth it, it really was."
Looking back, it appears that the Apollo mission was destined to be an evolutionary dead-end. It achieved its principal purpose, and then it became clear it was fit for little else:
In the end, the real problem for Nasa is that it did the hardest thing first. Kennedy's pledge to fly to the moon within a decade was made when its astronauts had clocked up exactly 20 minutes' experience of manned spaceflight. "We wondered what the heck he was talking about," recalls Nasa flight director Gene Kranz. To get there before the Russians the agency was obliged to design craft that were highly specific to the task. Hence the Saturn V, the Apollo capsule and the lunar module. Unfortunately, these vehicles were fairly useless at anything else in space - such as building a space station - and Nasa, having nearly broken the bank with Apollo, had to start again on budgets that dwindled dramatically as the decades passed.
The article concludes:
The conclusion is therefore inescapable. Kennedy's great vision and Armstrong's lunar footsteps killed off deep-space manned missions for 40 years - and probably for many decades to come. As DeGroot says: "Hubris took America to the moon, a barren, soulless place where humans do not belong and cannot flourish. If the voyage has had any positive benefit at all, it has reminded us that everything good resides here on Earth."
The other example here is the Blu-ray optical disc, which was created by an industry consortium as a successor to the ubiquitous DVD. Compared to the DVD's 4.7 GB capacity, Blu-ray offers far higher capacities of 25 GB (single-layer) and 50 GB (dual-layer). Blu-ray competed for market space with Toshiba's HD-DVD and won the format wars. But not for long, it would appear. While engineers were focused on a successor to the DVD (which in turn succeeded the Compact Disc), the ecosystem changed around them: direct digital downloads over the internet are growing in popularity along with ever-increasing connection bandwidth. Blu-ray has won the battle, but it may eventually lose the war. It's not clear if Blu-ray's developers will ever recoup their development expenses. Blu-ray players are priced high to recoup those costs, but this very situation militates against their rapid market penetration, especially during a recession. From the article:
Blu-ray will doubtlessly continue to grow in popularity as more of us buy large HD-capable flat screen televisions. In the same vein, it’ll continue to make inroads in computing and video gaming markets. But it’s a case of too little, too late, as long-term trends point to a slower uptake than DVDs ever had. When we can simply download a good-enough copy of a movie from iTunes and save it to a USB drive or mobile device for viewing pretty much anywhere, why would we even bother with a power-hungry, noisy, expensive and frankly inconvenient disc in the first place?

By the time most consumers have asked themselves this question, the answer will already be in: Optical discs are a fading technology, and investing in them now could be a shorter-term move than you might have initially anticipated.
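The download-versus-disc tradeoff can be put in rough numbers. The link speed below is an assumption chosen for illustration (broadband speeds vary widely), but the arithmetic shows why growing bandwidth erodes the disc's advantage:

```python
# Back-of-the-envelope: how long would a full single-layer Blu-ray
# image take to download?  The 10 Mbit/s link speed is an assumed,
# illustrative figure, not a measured one.
disc_gb = 25                                 # single-layer Blu-ray capacity
link_mbps = 10                               # assumed broadband downlink
seconds = disc_gb * 8 * 1000 / link_mbps     # GB -> gigabits -> megabits, then / (Mb/s)
hours = seconds / 3600
print(f"{hours:.1f} hours")                  # roughly 5.6 hours at this speed
```

At this assumed speed a full disc's worth of data is an overnight download; halve the file with a "good-enough" compressed copy, or double the link speed, and the disc's capacity edge starts to look academic.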
Your typical management tome might ask 'managers' to perform some sort of 'strategic analysis of the market/technology landscape' by employing a sainted analytical tool that gives cute names to different cells in a matrix. The reality is that no management theory can beat large-scale paradigm shifts, or even the demands of the moment. Oftentimes, you do what you are able to do and hope the ecosystem will continue to support you, while placing a few bets on the side on other alternatives whose prospects currently seem remote.