Sunday, July 05, 2009

When Disney Met Dali

From the text at Monsters And Rockets:

In 1946 legendary surrealist Salvador Dali formed an unlikely friendship with Walt Disney, and they spent some time collaborating on a short film called Destino. Dali and Disney artist John Hench worked on a lot of storyboards, but only 18 seconds of test footage were shot before the project was abandoned.

In 1999, Disney's nephew Roy Edward Disney was working on Fantasia 2000 and he decided to complete the Destino project, over 50 years after production began. 25 Disney artists worked from the original storyboards (with some input from Hench himself, and notes from the journals of Dali's widow) and finally completed Destino using a mix of hand-drawn and computer animation. The 18 seconds of test footage were included, in the shots of the two weird, turtle-like creatures seen above.

Destino didn't end up as part of Fantasia 2000 and hasn't been widely screened. It was seen in theaters with the films Calendar Girls and Triplets of Belleville, but so far it hasn't been released on DVD. It's amazing to look at, but I have the feeling that the imagery in Dali's own version would have been a bit more disturbing. (Notice how those turtle monsters kind of stand out from everything around them?) It's also a little funny how the Disney artists just can't resist making the dancing girl into a Disney princess. There are a few shots in here that look a bit like Belle in Dali Land.

Who's on foist?

A timeless classic that entered the popular lexicon ...

Saturday, July 04, 2009

The State is NOT The Community

Over and over again, I see writers -- who ought to know better -- failing to distinguish between Communism and Community. Communism is where the State arrogates all power to itself and, in fact, does everything to destroy community. The Communist State would like to make every citizen completely dependent on it for everything. The State decides what your duties and responsibilities are and also provides for all your needs. In theory. Community, on the other hand, is a phenomenon that emerges through natural processes from the grassroots, when a group of people find themselves in a situation where they face a common fate. Community is not imposed from on high, unlike a Communist State.

Open Source and Social Media represent and encourage community-oriented phenomena, not Communism, as their detractors often allege. Both the Totalitarian State and the Powerful Corporation are inimical to the welfare of the Community. Communities should strive to make the State and the Corporation subserve them, rather than rule over them.

THIS, is the REAL first iPhone 3GS music video

... or so they say ...

Thursday, July 02, 2009

The thrill is gone, baby ...

Is it over, already? Man, I don't believe it ... maybe I'm just burned out a little, need to take a break from Twitter. Yeah, there've been a few good links to look up, and some nice, pithy sayings that have been keepers. I've linked up with some interesting people, and had a few Twitterversations. Still, I feel a bit jaded. Not sure why.

Though I'm 'following' nearly 190 persons, it feels as though most of the posts come from a handful; almost certainly, not all 190 are posting, and even among those who are, my impression is that no more than about 20 tweet frequently.

I guess it feels like I'm meeting the same ol' folks saying the same ol' kinda things that I expect them to say, given that I'm beginning to discern a pattern for each. It's interesting that one unconsciously begins to construct a certain unique persona for each individual, as a gestalt of all their tweets. The tone, the content, the syntactic structure, the tweeting frequency, and the pattern of tweeting (quick bursts of many tweets, periodic tweeting at fairly regular intervals, or random intervals) as well as the avatar image used, all contribute to the complexion of the persona that emerges from the tweetstream.

Guess I need to follow some more people, although I wonder if I can handle that.

It is a cocktail party. Or maybe it is like one of the long train journeys we used to have when I was a child, which often took up to three days to complete. Along the way, we would befriend fellow passengers, have interesting conversations, share food, assist one another (especially in keeping an eye on kids and belongings) and in general have a great time. And then we'd get off the train and never see or hear from them again. Kinda sad, but one's life was enriched anyway, and it made for a memorable journey, making one eager to embark on another one ere too long.

Yes, Twitter feels a bit like that. You overhear scattered fragments of conversations among strangers and a fair bit of it is fun. There is bustle and noise, and a sense of movement. People stream in and out. Passengers get on and off. There are moments of quietude, and then bursts of activity and sound. Some stuff is funny, other stuff is dreary, a bit of it is boring, and then there is some pretty interesting stuff.

You'll likely never come to know people completely, just the side of them they choose to reveal during the journey.

The thing is, the train keeps going on forever, even if you have to get off at some place. And you know that the train didn't start at the place you got on; it's come from a long way off, been running for a long time, and its final destination is way beyond where you will disembark. There are a lot of people on board, and you will never get to meet them all; and some you'll never want to meet.

Maybe I need to visit some other compartments (carriages) and wander around in previously unexplored parts of the train.

Yeah, that's what I'll do. Next time. Maybe. My brain's tired from all the listening and tweeting and trying to make sense of it all. Good night.

Disintermediating Media (or Media for Disintermediation)

Likely the term used most often in an unintentionally ironic manner is the word disintermediation. And in the past year or so, it has become among the top buzzwords in use, flung around freely from every pulpit everywhere, typically delivered in a booming, authoritative tone. Disintermediation is what the New Media or Social Media is all about, we are told. Disintermediation takes away the media so that there is nothing -- nothing -- that remains between you and the thing with which you choose to interact. Take this one, for instance:

Social media is a de-institutionalising and disintermediating force. It gets rid of institutionalised functions. This is the lesson from every sector it has touched. In music it has got rid of the music business (and the creation and sharing of music has flourished). In news it is getting rid of the news business (and the creation and sharing of information is flourishing). In government, logically therefore, it will get rid of the government business.
Ye-ep! Social media disintermediates! You read that right! Now how in heck does it manage that? Perhaps in the manner a dog chases its tail or a snake swallows itself?
Web-ons of Mass Disintermediation: Once the world is completely disintermediated, then we shall all be free! Liberte! Egalite! Fraternite! Revolucion! Che! Etc.!
Hey, it happened in Iran, didn't it?

A search on teh google for the term disintermediation came up with 231,000 hits [June 2, 2009; 8:09 am GMT]. Bing, on the other hand, delivered 208,000. The difference of 23,000 disintermediations seems to have gotten intermediated somewhere in the vast search space separating the two search engines. Interestingly, the Wikipedia definition came up first on both lists -- I'm figuring that Wikipedia is the most popular medium of disintermediation out there. [Disintermediate produced 55,000 hits on Google and 18,400 on Bing; disintermediating gave 61,300 and 12,400 on Google and Bing respectively. Wikipedia continued to rule.]

Google! Hey, there's a disintermediary (17,100 Google hits, but only 112 on Bing! Bing! and all the intermediaries crumble to dust), if there was one! Google eliminates everything that stands between you and the information you seek, right? Right?

A discussion of the larger problem with definitions and pronouncements is in order especially because of the currently raging controversy over the book, Free by Wired editor Chris Anderson, sparked off by a none-too-positive New Yorker review by superstar author Malcolm Gladwell and a plagiarism allegation. This controversy is being discussed all over the blogosphere (and is being disintermediated by Wikipedia even as we speak; 286,000 hits for the search string "chris anderson malcolm gladwell free").

The issue is what Anil Dash colorfully calls airport books: easy-reading fare, usually written intelligently and engagingly by brand-name authors, dealing with topics of broad current interest in a manner suited to cocktail-party conversation. The material presented usually makes reference to scientific research, and pithy, easily remembered (and quotable) conclusions are presented as if with authority; very often the language and phrases from the book quickly enter the general idiom and become part of folklore, accepted without the need for proof. Nevertheless, very often the pronouncements made are quite shallow and don't stand up to intense scrutiny. Cross-questioning by intellects of such stature as my hero Richard Feynman would likely cause the theories and pronouncements to crumble to dust instantaneously. Indeed, the writers often take advantage of the fact that public memory is short: predictions made in the book are (to the relief of the authors) quickly forgotten, to make way for new ideas and new books (of limited shelf life) that come streaming down the airport aisles. The books, in fact, could be considered at best medium- to high-brow entertainment, meant to tickle the mind and provoke thought and discussion. Unfortunately, especially because of the authors' reputations (and the extensive references in the books), they are often treated as having the force of real, formal scholarship, and in that sense serve only to muddy public discourse.

Back to the subject, though: I blame popular writers of airport books and similar blogs for having created this very flaky buzz around the idea of disintermediation. Apparently, New and Social Media eliminate the much reviled Middle Man thereby reducing costs, increasing transaction speed, etc. But is this really disintermediation? Let's see,
mediation: coming in the middle
media: something that comes in the middle
How in heaven can media disintermediate? The reality is that there will always be a medium of some sort whenever two or more parties are involved in a transaction of any kind (physical or informational). Unless some means of effecting transactions is invented that instantaneously generates the required knowledge and information, as well as goods and services, inside a recipient's brain and body respectively -- and I don't see that happening anytime soon. So,
Please repeat after me: Media does not, and cannot disintermediate.
What social media are achieving is to replace one kind of medium, which has outlived its purpose and now only creates inefficiencies, with another sort of medium that seeks to eliminate those inefficiencies. It is not at all likely that the New Media will remain the Gold Standard forever -- there will come a time when they too will begin to show their age and will need to be supplanted by entirely new media. The danger of falling in love with the term disintermediation is that the label will become permanently associated with New and Social Media, making their introduction and use unchallengeable. They will become the New Holy Cows, and anybody challenging their use, even when the media become ineffective or inefficient, will risk inviting public scorn, and quite possibly be denied the space to publicly present their views.

So may I propose a new, more appropriate and relatively neutral term that correctly captures what's going on, i.e., the replacement of one set of media with another set of potentially more effective and efficient media? Well, ladies and gentlemen, boys and girls, here it is:
But ... but ... but ... you say, that can't be, this ain't a remedy, that term's already taken, and it means something else. How about,
Don't like that, eh? Disintermediation rolls off the tongue, with so much gravitas and authority, that one is loath to give it up. Consider, then,
the bene- prefix meaning "good" in Latin. Or even,
Hey, I think I've got it!
Anyway, dear reader, think up some of your own. But for heaven's sake, please reconsider your indiscriminate employment of the word disintermediation for purposes that have everything to do with mediation.

Thank you!

Wednesday, July 01, 2009

First iPhone music video shot on iPhone?

The iPhone is turning out to be a stealth device ...

One week in Cloud Twitterland

So I figured I needed to know what this Twitter thing was all about. Yeah, I had a Twitter account (@murliman) already; got one a long while ago, needed to feel I was cool and all. But after sending out a tweet or two, I couldn't really figure out what this was all about and abandoned further efforts. Here's my first ever tweet:

exploring twitter
from web
Note the date on that. My second tweet is identical, sent at the same time. Must have goofed up, I guess. My third tweet came more than 7 months later:
Wondering why the Twitter logo(?) is what it is (o_O) Are those the eyes of a kinda stoned birdie?
from web
And that was it.

So my Twitterscape lay fallow, unhonored, unsung, unploughed. Turned out that I was something like the first guy with a telephone or email: nice, but it all seemed pretty pointless. Things turned around in early June this year. I had started blogging again - like mad - the blame for which falls entirely at the feet of my former student, Ashish Sharma (Twitter handle: @ashinertia). I figured I needed to get word out about my blogposts and reasoned I could use Twitter for that; after all there were probably a billion Twitterers out there. At the time, I had a small (single-digit) number of 'followers' and at least they could come to know about my blogs thus. My tweeting resumed with this tweet:
from web
I still didn't know what Twitter was all about, but figured I'd just jump in and find out. In this quest I was aided by @marcynewman who tweeted me, saying, 'Hey, you're tweeting!' or something like that. To which I responded:
@marcynewman :-) just experimenting; was getting frustrated that I didn't know how to use this dang thing. You UberTechi, you!
from web in reply to marcynewman
Now @marcynewman is a major techie even though she claims I'm the one who made her technical and all. She is the quintessential technobabe which term I first heard used by my former colleague Robert "I'm not Bob" Minch when referring to someone we mutually knew. @marcynewman's body is covered with technology; she could walk onto a sci-fi movie set without any additional props or gear and fit right in.

For the next one week I posted 17 tweets ('updates' in Twitterish), 7 of which were a conversation between myself and @marcynewman (which could have happened through chat or direct messaging; I'm still trying to figure out Twitter), and the rest contained links to blogposts I had created; yes, I was blogging like crazy compared to previous years.

Then it all died out as abruptly as it had started.

I'm told this is a standard pattern among Twitterers; the Twitterscape is littered with millions of abandoned @names. But then another thing happened; my son was done with his examinations, and I was free. After that my tweeting began in right earnest all over again, about ten days ago, on June 22, since when I've sent out over 120 tweets; also, quite happily, I am now a certified twunkie (tip o' the hat to @durrink for that term) -- a Twitter Junkie. I think I am beginning to get it and I'll try to present Twitter according to @murliman (tat@m?).

Understanding Twitter

First of all, I think I understand the reason why so many abandon Twitter: one's level of participation in Twitterland is not a continuum of values -- either you're pretty close to being a junkie or you don't go there at all. It's pretty pointless to go there occasionally unless:
  • you're a celebrity and your fans out there are dying for a few morsels now and then
  • you like to be stimulated by random, serendipitous posts from friends or strangers
So, the occasional visitor is either mostly a Tweet generator or a Tweet consumer. Tweeting occasionally by anyone else is of interest primarily to close friends and family, and that too only if said close friends and family are Twunkies.
One might visit Twitter constantly for a limited period of time if
  • there's some major event happening, e.g., the Iran Revolution, the Mumbai Terror Attack, a conference or a ball game, and you'd like to know what's going on (or if you're in the thick of things, even send out tweets yourself)
Those who sign up for a Twitter account to find out what it's all about need to stick it out long enough to become Twunkies (if they are susceptible to the Twug - the Twitter Drug); otherwise they most likely will abandon the effort. My first couple of forays were very superficial; it's only the third time around that I decided to stay the course until something magical happened. And in fact, it did. Like most addictions, I'm not entirely sure why I Tweet, but it does provide some sort of a high; and if I stay away long enough, I begin experiencing some sort of Twithdrawal - and need to go get my fix again. One article sums it up pretty well:
The Twitter Cycle: Curiosity, abandonment, addiction. Global visitors hit 37 million.
In just two years, Twitter has come a long way. And in the coming years, the social media landscape will transform wildly because of a seed called Twitter.

In some ways, Twitter was a clean sheet restart of electronic communications. Email, blogs, and Facebook had been around for a while, but they had become much too baroque, too overladen with features. They had become bulky, unwieldy, complex. Twitter was a way to return to the drawing board and start from scratch all over again.

The SMS framework in mobile phone communications seemed like a good starting point. There was email, which required computers (or smart phones) and there was SMS (for mobile phone communications). Twitter's founders thought -- how about seamlessly combining mobile phone and computer communications by employing the lowest common denominator -- SMS -- as the messaging structure?

Think about it: It takes two to SMS -- or Tweet: a sender and a receiver. Let's assume these are persons known to each other. Perhaps the tweets are perishable, standalone, and have no further value. There is no need to save or organize the tweets. On the other hand, perhaps the tweeting constitutes a conversation, and there is value in preserving the tweets, much like email messages. If so, the only structuring mechanism needed to manage them is to organize and list them in chronological order. It might also be useful to list each Twitterer's messages separately (and in chronological order). So there are three lists: the complete list of Tweets in chronological order, and two lists containing each Twitterer's own Tweets (the latter two, of course, do not need to be separately maintained but can be generated dynamically from the first on demand).
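The three-list idea can be sketched in a few lines of Python. This is a toy model with invented names, not Twitter's actual implementation: one master list in chronological order, with the per-Twitterer lists derived on demand rather than stored.

```python
from collections import namedtuple

# A tweet is just an author, a timestamp, and some text.
Tweet = namedtuple("Tweet", ["author", "timestamp", "text"])

# The single master list, kept in chronological order.
stream = [
    Tweet("alice", 1, "exploring twitter"),
    Tweet("bob", 2, "hello alice"),
    Tweet("alice", 3, "hello back"),
]

def tweets_by(author, stream):
    """Derive one Twitterer's own list on demand from the master list;
    no separate per-author list needs to be maintained."""
    return [t for t in stream if t.author == author]

print([t.text for t in tweets_by("alice", stream)])  # ['exploring twitter', 'hello back']
```

The design point is simply that the master list is the only state; everything else is a view computed from it.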

Now, friends and family come to learn about this new messaging medium, and want in on the action. The total population of Twitterers grows, say to about 5. These five persons all know each other, and would like to keep up with each other's tweets. It's personal, or work related, but it keeps them connected. The same structuring mechanism (chronological, and by Twitterer) suffices.

Then others begin to join the Twitterscape -- friends, and later, friends of friends, friends of friends of friends ... ad nauseam. Pretty soon, the number of Twitterers is in the hundreds - or thousands. Everybody no longer knows everybody else. Most have interest in the tweets of only a subset of the Twitternation. And so the idea of a Follower is introduced: each Twitterer chooses to follow some subset of Twitterers in the Twitternation. This select subset (unique to each Twitterer) is made up of individuals known as Friends (not to be confused with real friends in the real world).

So where there once was a single, completely connected cluster of Tweeters, there now are tens of millions of clusters, which in turn are connected to other clusters. Each cluster represents one individual Twitterer and his/her Followers. That Twitterer, in turn, is a Follower (and hence a member) of many other clusters.

What, if any, change is needed to structure and organize Tweets (at least as viewed by a Twitterer)? No longer is it feasible or useful for any Twitterer to view the entire tsunami of Tweets generated by the entire Twitternation of tens (or hundreds) of millions: you view the Tweets of only those you follow. Following, then, is the structuring mechanism for bringing an individual's Tweet updates down to a manageable number (besides reducing bandwidth consumption on the Internet by several orders of magnitude).
To manage communications with some level of sanity, each Twitterer chooses to Follow only a relatively small number of Twitterers (although some Twitterers seemingly follow tens of thousands of others -- to what end, I don't know; it's unlikely that they actually follow them all, or that all of those tens of thousands tweet regularly).
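Following-as-a-filter can be sketched the same way (again an illustrative toy with invented data, not Twitter's real code): each Twitterer sees only the slice of the global stream produced by the accounts they follow.

```python
# The global firehose: (author, text) pairs in chronological order.
firehose = [
    ("alice", "good morning"),
    ("bob", "link of the day"),
    ("carol", "conference starts now"),
    ("bob", "lunch"),
]

# Each Twitterer follows only a small subset of the Twitternation.
following = {"alice", "bob"}

def timeline(firehose, following):
    """Reduce the tsunami of all tweets to one person's manageable timeline."""
    return [(author, text) for author, text in firehose if author in following]

print(timeline(firehose, following))
# [('alice', 'good morning'), ('bob', 'link of the day'), ('bob', 'lunch')]
```

Carol's tweet simply never reaches this timeline; the filtering is what keeps the personal view (and the bandwidth) manageable.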

What other structuring mechanisms are available besides View by Individual Twitterer's Tweetstream and View by a Friend's Tweetstream?
Enter: Hashtags. Hashtags are an innovation from Twitterers themselves rather than from Twitter's developers. Hashtags are of the form
#searchterm
where 'searchterm' is the string one is searching for. Twitterers developed this as a means of tagging and searching for all Tweets relating to a particular issue, e.g.,
#iranelection or #pdf09
where pdf09 is the name of a recent conference. All tweets that include a specific hashtag are listed in reverse chronological order. While it is possible to search for any character string in Twitter, using a '#' prefix implies that the Twitterer deliberately intended for it to show up in a search. Typically, there is a consensus to employ a specific character string since the Twitter system does not create separate forums to deal with specific issues or events.
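Since a hashtag is just an in-band convention, a tag search amounts to a string scan plus a reverse-chronological sort. A minimal sketch, with invented sample tweets (not the Twitter API):

```python
import re

# (timestamp, text) pairs; the '#' prefix marks a deliberate tag.
tweets = [
    (1, "counting votes #iranelection"),
    (2, "great keynote at #pdf09"),
    (3, "more protest reports #iranelection"),
]

def search_hashtag(tag, tweets):
    """All tweets containing #tag, listed newest first."""
    pattern = re.compile(r"#" + re.escape(tag) + r"\b")
    hits = [t for t in tweets if pattern.search(t[1])]
    return sorted(hits, key=lambda t: t[0], reverse=True)

for ts, text in search_hashtag("iranelection", tweets):
    print(ts, text)
```

Note there is no registry of tags anywhere: the "forum" for #iranelection exists only because many Twitterers agreed to type the same string.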

That last point -- that Twitter creates no separate forums -- is important: other social media deliberately create walls and fences so that discussions proceed and socialization occurs in finite forums with defined memberships. Twitter, on the other hand, erects no walls and intentionally permits serendipitous discoveries. Being a Twitterer is akin to wandering into a vast mall or commons, bumping into friends and strangers, chatting with some, overhearing conversations, making general pronouncements that some might hear and most might ignore. It's like a never-ending cocktail party on a monstrous scale. There are some who successfully bring their real-world celebrity status or recognition into Twitter. And there are others whose real-world status may be markedly different (better or worse) from the one enjoyed (or deplored) by their TwitterPersona. Take the case of the musician who earned $19,000 in just ten hours from her Tweets. Certainly better than anything she had managed in real life until then.

I, for one, am enjoying the ride so far. It tickles me to read Tweets coming in from persons of the stature of Arnold Schwarzenegger, Nandan Nilekani, Al Gore and for gosh's sake, Jack Welch! I felt Twitter had arrived the moment I found a post from Welch.

The question now is, what other ways might there be to structure the basic Twitter stream? The need for alternative structuring mechanisms is already evident in the large number of Twitter applications that have emerged during the very short life of Twitter, and more are in the pipeline. Twitter is like a vast, constantly changing terrain, and Twitterers soon learn that they need help making sense of this relentlessly transmogrifying space; they need maps of some kind, and the various Twitter apps help both in making sense of the Twitterscape and in negotiating it successfully. Within a week of diving into the TwitterCloud, I felt disoriented enough to scan the web and download tools for managing the process. I am now using two fine, highly recommended Twitter desktop apps, TweetDeck and Seesmic Desktop, both built on Adobe AIR, the Rich Internet Application (RIA) development platform.

I can see how the TwitterStream can be the basic building block of just about any communication-based application, mostly involving humans, but not necessarily so: I can conceive of embedded digital devices 'friending' and 'following' other Twittering digital devices, or even humans exchanging Tweets with machines (the machines, of course, parse the Tweets and take appropriate actions). Hence Twitter can become a universal communication infrastructure at a level just above machine communication but low down on the hierarchy of human communication.

It helps Twitter's case that it is addictive: one blogger has found the need to publicly lament that his Twittering has left him little time to blog, and that he was going to have to tear himself away from Twittering for the purpose.
Like the world depicted in the movie The Matrix, Twitter is an alternative reality: kaleidoscopic, rich, stimulating, heady, fast-paced, diverse, ever-changing, an infinite series of windows both into the real world and into the world of the Web, which is itself, in turn, a series of windows ... It's not hard to understand how one can get sucked into this maelstrom with a much greater force than that exerted by the Web itself. There is no starting point, nor is there a time and place to get off; one has to force oneself off and return to the physical world. Twitter is a world with strong connections to the physical world, but it is its own space, has its own complexion and character, and feels no less real than the one in which we eat and drink and sleep. Twitter has some of the characteristics of Massively Multiplayer Online Role-Playing Games (MMORPGs), but unlike the latter, it has few or no barriers to entry.

One broader observation: I have been associated with what is now called social media -- which used to be known by various names such as groupware, computer support for cooperative work, group support systems, etc. -- for over two decades, and have contributed to formal academic research in the field. In all these years, however, not one social application developed by researchers in universities and corporate labs has found widespread acceptance among the general public. The most wildly popular social applications, such as Facebook, Twitter, and Blogger, were developed by young, imaginative, energetic persons with no pretensions to doing research: they built tools that they found useful themselves, and, happily, tens of millions of others found them useful too. None of these social apps came out fully developed or with any coherent theoretical model, such as academicians insist on creating before they build and explore tools and applications. The apps came entirely out of the unique, idiosyncratic experiences of a few individuals; surprisingly, they also matched the needs of the many. Over time, with feedback from users, the tools rapidly evolved. In all these instances, theory appears to follow, rather than lead, phenomena. There must be a whole lot of researchers trying to figure out why Facebook and Twitter have become the monsters they are now, but none of them could have anticipated them based on any available theoretical framework. Clearly, the current social and social-psychological theories are flawed, or limited, and need to be reviewed and revised.

But the situation also calls into question the value of academic research in the field designed to produce new social applications (rather than investigate the impacts of extant social applications). It is humbling to realize that there is little to show for over 25 years of formal university and corporate lab research and development in the design and introduction of social media.

I ought to emphasize that the development of core technologies such as operating systems demands the knowledge, skills, and experience of outstanding researchers with excellent credentials; shooting from the hip and designing by the seat of the pants doesn't take you very far when building operating systems, communication protocols, microprocessors, and so on. The most influential operating systems -- IBM's OS/360, Bell Labs' Unix, Digital Research's CP/M, and Xerox's Alto and Star -- were all written by, or under the supervision of, Ph.D.s. The same restriction doesn't seem to apply to applications built on those solid foundations. The last 15 years have shown that the best innovations in application-oriented technologies happen when useful technology-building tools are widely distributed among the general population, whether or not they are technically qualified. There is an incredible amount of ingenuity out there that goes far beyond what might obtain within the walls of formally designated research institutions. Society is greatly benefited by seeding it with an array of Tools for Innovation: give 'em the Tools to Create and then leave them alone.

UPDATE (July 2, 2009): Twittering considered harmful. Ranging from musculoskeletal problems caused by repetitive motion to the more sinister Going-About-One's-Life-While-Distracted-By-Twittering resulting in accidents and serious injury. That, and psychological issues relating to addiction and withdrawal from the physical world.