It's no secret that Microsoft hires the best tech researchers out there by the truckload, and quite possibly has a research lab with an intellectual and creative heft that could compare with the legends of yore -- IBM's Watson, AT&T's Bell Labs, and Xerox's Palo Alto Research Center. Of course, it still continues to dish out stinkware such as Windows Vista, but that's despite the efforts of its brains trust, much of whose work likely never sees the light of day. Perhaps in response to the recent leak of Apple's touch-sensitive mouse, Microsoft has released a video of five different types of touch-oriented mice. The demo is really, really neat. I'd love to own one or more of those toys, just to play around.
Wednesday, October 07, 2009
Thursday, July 09, 2009
Wednesday, July 08, 2009
Sunday, July 05, 2009
From the text at Monsters And Rockets:
In 1946 legendary surrealist Salvador Dali formed an unlikely friendship with Walt Disney, and they spent some time collaborating on a short film called Destino. Dali and Disney artist John Hench worked on a lot of storyboards, but only 18 seconds of test footage were shot before the project was abandoned.
In 1999, Disney's nephew Roy Edward Disney was working on Fantasia 2000 and he decided to complete the Destino project, over 50 years after production began. 25 Disney artists worked from the original storyboards (with some input from Hench himself, and notes from the journals of Dali's widow) and finally completed Destino using a mix of hand-drawn and computer animation. The 18 seconds of test footage were included, in the shots of the two weird, turtle-like creatures seen above.
Destino didn't end up as part of Fantasia 2000 and hasn't been widely screened. It was seen in theaters with the films Calendar Girls and Triplets of Belleville, but so far it hasn't been released on DVD. It's amazing to look at, but I have the feeling that the imagery in Dali's own version would have been a bit more disturbing. (Notice how those turtle monsters kind of stand out from everything around them?) It's also a little funny how the Disney artists just can't resist making the dancing girl into a Disney princess. There are a few shots in here that look a bit like Belle in Dali Land.
Saturday, July 04, 2009
Over and over again, I see writers -- who ought to know better -- being unable to distinguish between Communism and Community. Communism is where the State arrogates all power to itself and, in fact, does everything to destroy community. The Communist State would like to make every citizen completely dependent on it for providing everything. The State decides what your duties and responsibilities are and also provides for all your needs. In theory. Community, on the other hand, is a phenomenon that emerges through natural processes from the grassroots, when a group of people find themselves in a situation where they are faced with a common fate. Community is not imposed from on high, unlike a Communist State.
Open Source and Social Media represent and encourage community-oriented phenomena, not Communism, as their detractors often allege. Both the Totalitarian State and the Powerful Corporation are inimical to the welfare of the Community. Communities should strive to make the State and the Corporation subserve, rather than rule over them.
Friday, July 03, 2009
Thursday, July 02, 2009
Is it over, already? Man, I don't believe it ... maybe I'm just burned out a little, need to take a break from Twitter. Yeah, there've been a few good links to look up, and some nice, pithy sayings that have been keepers. I've linked up with some interesting people, and had a few Twitterversations. Still, I feel a bit jaded. Not sure why.
Though I'm 'following' nearly 190 persons, it feels as though most of the posts come from a handful; almost certainly, not all 190 are posting, and even among those who are, it feels (that's my impression) as though no more than about 20 are doing so frequently.
I guess it feels like I'm meeting the same ol' folks saying the same ol' kinda things that I expect them to say, given that I'm beginning to discern a pattern for each. It's interesting that one unconsciously begins to construct a certain unique persona for each individual, as a gestalt of all their tweets. The tone, the content, the syntactic structure, the tweeting frequency, and the pattern of tweeting (quick bursts of many tweets, periodic tweeting at fairly regular intervals, or random intervals) as well as the avatar image used, all contribute to the complexion of the persona that emerges from the tweetstream.
Guess I need to follow some more people, although I wonder if I can handle that.
It is a cocktail party. Or maybe it is like one of the long train journeys we used to have when I was a child, which often took up to three days to complete. Along the way, we would befriend fellow passengers, have interesting conversations, share food, assist one another (especially in keeping an eye on kids and belongings) and in general have a great time. And then we'd get off the train and never see or hear from them again. Kinda sad, but one's life was enriched anyway, and it made for a memorable journey, making one eager to embark on another one ere too long.
Yes, Twitter feels a bit like that. You overhear scattered fragments of conversations among strangers and a fair bit of it is fun. There is bustle and noise, and a sense of movement. People stream in and out. Passengers get on and off. There are moments of quietude, and then bursts of activity and sound. Some stuff is funny, other stuff is dreary, a bit of it is boring, and then there is some pretty interesting stuff.
You'll likely never come to know people completely, just the side of them they choose to reveal during the journey.
The thing is, the train keeps going on forever, even if you have to get off at some place. And you know that the train didn't start at the place you got on, it's come from a long way off, been running for a long time, and it's final destination is way beyond where you will disembark. There are a lot of people on board, and you will never get to meet them all; and some you'll never want to meet.
Maybe I need to visit some other compartments (carriages) and wander around in previously unexplored parts of the train.
Yeah, that's what I'll do. Next time. Maybe. My brain's tired from all the listening and tweeting and trying to make sense of it all. Good night.
Posted by Murli at 7/02/2009 11:26:00 PM
Likely the term used most often in an unintentionally ironic manner is the word disintermediation. And in the past year or so, it has become among the top buzzwords in use, flung around freely from every pulpit everywhere, typically delivered in a booming, authoritative tone. Disintermediation is what the New Media or Social Media is all about, we are told. Disintermediation takes away the media so that there is nothing -- nothing -- that remains between you and the thing with which you choose to interact. Take this one, for instance:
Social media is a de-institutionalising and disintermediating force. It gets rid of institutionalised functions. This is the lesson from every sector it has touched. In music it has got rid of the music business (and the creation and sharing of music has flourished). In news it is getting rid of the news business (and the creation and sharing of information is flourishing). In government, logically therefore, it will get rid of the government business.
Ye-ep! Social media disintermediates! You read that right! Now how in heck does it manage that? Perhaps in the manner a dog chases its tail or a snake swallows itself?
Web-ons of Mass Disintermediation: Once the world is completely disintermediated, then we shall all be free! Liberte! Egalite! Fraternite! Revolucion! Che! Etc.!
Hey, it happened in Iran, didn't it?
A search on teh google for the term disintermediation came up with 231,000 hits [June 2, 2009; 8:09 am GMT]. Bing, on the other hand, delivered 208,000. The difference of 23,000 disintermediations seems to have gotten intermediated somewhere in the vast search space separating the two search engines. Interestingly, the Wikipedia definition came up first on both lists -- I'm figuring that Wikipedia is the most popular medium of disintermediation out there. [Disintermediate produced 55,000 hits on Google and 18,400 on Bing; disintermediating gave 61,300 and 12,400 on Google and Bing respectively. Wikipedia continued to rule.]
Google! Hey, there's a disintermediary (17,100 Google hits, but only 112 on Bing! Bing! and all the intermediaries crumble to dust), if there was one! Google eliminates everything that stands between you and the information you seek, right? Right?
A discussion of the larger problem with definitions and pronouncements is in order especially because of the currently raging controversy over the book, Free by Wired editor Chris Anderson, sparked off by a none-too-positive New Yorker review by superstar author Malcolm Gladwell and a plagiarism allegation. This controversy is being discussed all over the blogosphere (and is being disintermediated by Wikipedia even as we speak; 286,000 hits for the search string "chris anderson malcolm gladwell free").
The issue is what Anil Dash colorfully calls airport books: these are easy reading fare, usually written intelligently and engagingly by brand-name authors, dealing with topics of broad current interest in a manner that is appropriate for cocktail party conversations. The material presented usually makes reference to scientific research, and pithy, easily remembered (and quotable) conclusions are presented as if with authority; very often the language and phrases from the book quickly enter the general idiom and become part of folklore, accepted without the need for proof. Nevertheless, very often the pronouncements made are quite shallow and don't stand up to intense scrutiny. Cross-questioning by intellects of such stature as my hero Richard Feynman would likely cause the theories and pronouncements to crumble to dust instantaneously. Indeed, the writers often take advantage of the fact that public memory is short, and predictions made in the book are (to the relief of the authors) quickly forgotten, to make way for new ideas and new books (of limited shelf life) that come streaming down the airport aisles. The books, in fact, could be considered at best to be medium- to high-brow entertainment, meant to tickle the mind and provoke thought and discussion. Unfortunately, especially due to the authors' reputations (and the extensive references in the books), they are often treated as having the force of real, formal scholarship, and in that sense serve only to muddy public discourse.
Back to the subject, though: I blame popular writers of airport books and similar blogs for having created this very flaky buzz around the idea of disintermediation. Apparently, New and Social Media eliminate the much reviled Middle Man thereby reducing costs, increasing transaction speed, etc. But is this really disintermediation? Let's see,
mediation: coming in the middle
media: something that comes in the middle
How in heaven can media disintermediate? The reality is that there will always be a medium of some sort whenever there are two or more parties involved in a transaction of any kind (physical or informational). Unless some means of effecting transactions is invented that instantaneously generates the required knowledge and information, as well as goods and services, inside a recipient's brain and body respectively -- I don't see that happening anytime soon. So,
Please repeat after me: Media does not, and cannot disintermediate.
So may I propose a new, more appropriate and relatively neutral term that correctly captures what's going on, i.e., the replacement of one set of media with another set of potentially more effective and efficient media? Well, ladies and gentlemen, boys and girls, here it is:
RE-MEDIATION or REMEDIATION
But ... but ... but ... you say, that can't be, this ain't a remedy, that term's already taken, and it means something else. How about,
NEOMEDIATION or NEWMEDIATION
Don't like that, eh? Disintermediation rolls off the tongue with so much gravitas and authority that one is loath to give it up. Consider, then,
BENEMEDIATION
the bene- prefix meaning "good" in Latin. Or even,
NOVAMEDIATION
Hey, I think I've got it!
TRANSMEDIATION: you know, as in TRANSformed MEDIATION?
Anyway, dear reader, think up some of your own. But for heaven's sake, please reconsider your indiscriminate employment of the word disintermediation for purposes that have everything to do with mediation.
Wednesday, July 01, 2009
So I figured I needed to know what this Twitter thing was all about. Yeah, I had a Twitter account (@murliman) already, got one a long while ago, needed to feel I was cool and all; but after sending out a tweet or two, I couldn't really figure out what this was all about and abandoned further efforts. Here's my first ever tweet:
11:10 AM Dec 16th, 2007 from web
Note the date on that. My second tweet is identical, sent at the same time. Must have goofed up, I guess. My third tweet came more than 7 months later:
Wondering why the Twitter logo(?) is what it is (o_O) Are those the eyes of a kinda stoned birdie?
10:54 PM Jul 29th, 2008 from web
And that was it.
So my Twitterscape lay fallow, unhonored, unsung, unploughed. Turned out that I was something like the first guy with a telephone or email: nice, but it all seemed pretty pointless. Things turned around in early June this year. I had started blogging again - like mad - the blame for which falls entirely at the feet of my former student, Ashish Sharma (Twitter handle: @ashinertia). I figured I needed to get word out about my blogposts and reasoned I could use Twitter for that; after all there were probably a billion Twitterers out there. At the time, I had a small (single-digit) number of 'followers' and at least they could come to know about my blogs thus. My tweeting resumed with this tweet:
4:20 PM Jun 6th from web
I still didn't know what Twitter was all about, but figured I'd just jump in and find out. In this quest I was aided by @marcynewman who tweeted me, saying, 'Hey, you're tweeting!' or something like that. To which I responded:
@marcynewman :-) just experimenting; was getting frustrated that I didn't know how to use this dang thing. You UberTechi, you!
12:18 AM Jun 7th from web in reply to marcynewman
Now @marcynewman is a major techie even though she claims I'm the one who made her technical and all. She is the quintessential technobabe, a term I first heard used by my former colleague Robert "I'm not Bob" Minch when referring to someone we mutually knew. @marcynewman's body is covered with technology; she could walk onto a sci-fi movie set without any additional props or gear and fit right in.
For the next one week I posted 17 tweets ('updates' in Twitterish), 7 of which were a conversation between myself and @marcynewman (which could have happened through chat or direct messaging; still trying to figure out Twitter) and the remaining contained links to blogposts I had created; yes, I was blogging like crazy, compared to the previous years.
Then it all died out as abruptly as it had started.
I'm told this is a standard pattern among Twitterers; the Twitterscape is littered with millions of abandoned @names. But then another thing happened: my son was done with his examinations, and I was free. My tweeting resumed in right earnest about ten days ago, on June 22; since then I've sent out over 120 tweets. Also, quite happily, I am now a certified twunkie (tip o' the hat to @durrink for that term) -- a Twitter Junkie. I think I am beginning to get it, and I'll try to present Twitter according to @murliman (tat@m?). Twitter makes sense if:
- you're a celebrity and your fans out there are dying for a few morsels now and then
- you like to be stimulated by random, serendipitous posts from friends or strangers
- there's some major event happening, e.g., the Iran Revolution, the Mumbai Terror Attack, a conference or a ball game, and you'd like to know what's going on (or if you're in the thick of things, even send out tweets yourself)
In some ways, Twitter was a clean sheet restart of electronic communications. Email, blogs, and Facebook had been around for a while, but they had become much too baroque, too overladen with features. They had become bulky, unwieldy, complex. Twitter was a way to return to the drawing board and start from scratch all over again.
The SMS framework in mobile phone communications seemed like a good starting point. There was email, which required computers (or smart phones) and there was SMS (for mobile phone communications). Twitter's founders thought -- how about seamlessly combining mobile phone and computer communications by employing the lowest common denominator -- SMS -- as the messaging structure?
Think about it: It takes two to SMS -- or Tweet: a sender and a receiver. Let's assume these are persons known to each other. Perhaps the tweets are perishable, standalone, and have no further value; there is no need to save or organize them. On the other hand, perhaps the tweeting constitutes a conversation, and there is value in preserving it, much like email messages. If so, the only structuring mechanism needed to manage the tweets is to organize and list them in chronological order. It might also be useful to list each Twitterer's messages separately (and in chronological order). So there are three lists: the complete list of Tweets in chronological order, and two lists containing only each Twitterer's Tweets (the latter two, of course, do not need to be separately maintained but can be generated dynamically from the first on demand).
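The three-list idea can be sketched in a few lines of code. This is purely my own illustration of the structure described above -- the names (Tweet, post, timeline_for) are mine, not anything from Twitter's actual implementation:

```python
from collections import namedtuple

Tweet = namedtuple("Tweet", ["author", "timestamp", "text"])

tweets = []  # the single master list, kept in chronological order

def post(author, timestamp, text):
    tweets.append(Tweet(author, timestamp, text))

def timeline_for(author):
    # A per-Twitterer list need not be stored separately:
    # it is generated on demand from the master list.
    return [t for t in tweets if t.author == author]

post("alice", 1, "first!")
post("bob", 2, "hello alice")
post("alice", 3, "hello back")

print([t.text for t in timeline_for("alice")])  # ['first!', 'hello back']
```

Note that only the master list is real storage; the per-person views are just filters over it, which is exactly why they can be produced dynamically.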
Now, friends and family come to learn about this new messaging medium, and want in on the action. The total population of Twitterers grows, say to about 5. These five persons all know each other, and would like to keep up with each other's tweets. It may be personal or work-related, but it keeps them connected. The same structuring mechanism (chronological, and by Twitterer) suffices.
Then others begin to join the Twitterscape -- friends, and later, friends of friends, friends of friends of friends ... ad nauseam. Pretty soon, the number of Twitterers is in the hundreds -- or thousands. Everybody no longer knows everybody else. Most have interest in the tweets of only a subset of the Twitternation. And so the idea of a Follower is introduced: each Twitterer chooses to follow some subset of Twitterers in the Twitternation. This select subset (unique to each Twitterer) is made up of individuals known as Friends (not to be confused with real friends in the real world).
So where there once was a single, completely connected cluster of Tweeters, there now are tens of millions of clusters, which in turn are connected to other clusters. Each cluster represents one individual Twitterer and his/her Followers. That Twitterer, in turn, is a Follower (and hence a member) of many other clusters.
What, if any, change is needed to structure and organize Tweets (at least as viewed by a Twitterer)? No longer is it feasible or useful for any Twitterer to view the entire tsunami of Tweets generated by the whole Twitternation of tens (or hundreds) of millions: you view the Tweets of only those you follow. Following, then, is the structuring mechanism for bringing an individual's Tweetupdates down to a manageable number (besides reducing bandwidth consumption on the Internet by several orders of magnitude).
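Following-as-filter can likewise be sketched in code. Again, the data shapes here are my own assumption for illustration, not Twitter's internals:

```python
# The full "firehose" of everyone's tweets (a made-up sample).
firehose = [
    {"author": "alice", "text": "breakfast thoughts"},
    {"author": "bob", "text": "a link"},
    {"author": "carol", "text": "conference notes"},
]

following = {"alice", "carol"}  # the subset of the Twitternation I follow

def my_stream(tweets, following):
    # The firehose shrinks to a manageable stream: only tweets
    # from people I follow survive the filter.
    return [t for t in tweets if t["author"] in following]

print([t["author"] for t in my_stream(firehose, following)])  # ['alice', 'carol']
```

The point is that the follow list is not a container of tweets; it is a predicate applied to the global stream, which is what makes it so cheap for the system to support.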
What other structuring mechanisms are available besides View by Individual Twitterer's Tweetstream and View by a Friend's Tweetstream?
#searchterm
'searchterm' is the string one is searching for. Twitterers developed this as a means of tagging and searching for all Tweets relating to a particular issue, e.g.,
#iranelection or #pdf09
where pdf09 is the name of a recent conference. All tweets that include a specific hashtag are listed in reverse chronological order. While it is possible to search for any character string in Twitter, using a '#' prefix implies that the Twitterer deliberately intended for it to show up in a search. Typically, there is a consensus to employ a specific character string since the Twitter system does not create separate forums to deal with specific issues or events.
That last sentence is important: other social media deliberately create walls and fences for discussions to proceed and socialization to occur in finite forums with defined memberships. Twitter, on the other hand, erects no walls and intentionally permits serendipitous discoveries. Being a Twitterer is akin to wandering into a vast mall or commons, bumping into friends and strangers, chatting with some, overhearing conversations, making general pronouncements that some might hear and most might ignore. It's like a never-ending cocktail party on a monstrous scale. There are some who successfully bring their real-world celebrity status or recognition into Twitter. And there are others whose real-world status may be markedly different (better or worse) from the one enjoyed (or deplored) by their TwitterPersona. Take the case of this musician who earned $19,000 in just ten hours from her Tweets. Certainly better than anything she had managed in real life until then.
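The hashtag mechanism described a couple of paragraphs back also lends itself to a small sketch. The function names here are mine, invented for illustration -- this is not Twitter's API:

```python
import re

def hashtags(text):
    # Extract every '#'-prefixed word from a tweet's text.
    return set(re.findall(r"#(\w+)", text))

def search(tweets, tag):
    # Tweets carrying the tag, listed most recent first
    # (reverse chronological order).
    return [t for t in reversed(tweets) if tag in hashtags(t)]

tweets = [
    "polling stations open #iranelection",
    "keynote starting #pdf09",
    "crowds gathering #iranelection",
]

print(search(tweets, "iranelection"))
```

Since a hashtag is nothing more than a marked substring, the "forum" for an event exists only by convention: whoever includes the agreed-upon tag is in, with no membership list anywhere.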
I, for one, am enjoying the ride so far. It tickles me to read Tweets coming in from persons of the stature of Arnold Schwarzenegger, Nandan Nilekani, Al Gore and for gosh's sake, Jack Welch! I felt Twitter had arrived the moment I found a post from Welch.
The question now is, what other ways might there be to structure the basic Twitter stream? The need for alternative structuring mechanisms is already evident in the large number of Twitter applications that have emerged during the very short life of Twitter, and more are in the pipeline. Twitter is like a vast, constantly changing terrain, and Twitterers soon learn that they need help making sense of this relentlessly transmogrifying space; they need maps of some kind, and the various Twitter apps help both in making sense of the Twitterscape and in negotiating it successfully. Within a week after diving into the TwitterCloud, I felt disoriented enough to scan the web and download tools for managing the process. I am now using two fine, highly recommended Twitter desktop apps, Tweetdeck and Seesmic Desktop, both constructed using Adobe AIR, the Rich Internet App (RIA) development platform.
I can see how the TwitterStream can be the basic building block of just about any communication-based application, mostly involving humans, but not necessarily so: I can conceive of embedded digital devices 'friending' and 'following' other Twittering digital devices, or even humans exchanging Tweets with machines (the machines, of course, parse the Tweets and take appropriate actions). Hence Twitter can become a universal communication infrastructure at a level just above machine communication but low down on the hierarchy of human communication.
It helps Twitter's case that it is addictive: one blogger has found the need to publicly lament that his Twittering has left him little time to blog, and that he was going to have to tear himself away from Twittering for the purpose.
One broader observation: I have been associated with what is now called social media -- formerly known by various names such as groupware, computer-supported cooperative work, and group support systems -- for over two decades, and have contributed to formal academic research in the field. In all these years, however, not one social application developed by researchers in universities and corporate labs has found widespread acceptance among the general public. The most wildly popular social applications such as Facebook, Twitter, and Blogger were developed by young, imaginative, energetic persons with no pretensions to doing research: they built tools that they found useful to themselves, and happily, tens of millions of others found them useful too. None of these social apps came out fully developed or with any coherent theoretical model, as academicians insist on creating before they build and explore tools and applications. The apps came entirely out of the unique, idiosyncratic experiences of a few individuals; surprisingly, they also matched the needs of the many. Over time, with feedback from users, the tools rapidly evolved. In all these instances, theory appears to follow, rather than lead, phenomena. There must be a whole lot of researchers trying to figure out why Facebook and Twitter have become the monsters they are now, but none of them could have anticipated them based on any available theoretical framework. Clearly, the current social and social-psychological theories are flawed, or limited, and need to be reviewed and revised.
But the situation also calls into question the value of academic research in the field designed to produce new social applications (rather than investigate the impacts of extant social applications). It is humbling to realize that there is little to show for over 25 years of formal university and corporate lab research and development in the design and introduction of social media.
I ought to emphasize that the development of core technologies such as operating systems demands the knowledge, skills, and experience of outstanding researchers with excellent credentials; shooting from the hip and designing by the seat of the pants doesn't take you very far when designing operating systems, communication protocols, microprocessors, and so on. The most influential operating systems -- IBM's OS/360, Bell Labs' Unix, Digital Research's CP/M, and Xerox's Alto and Star -- were all written by, or under the supervision of, PhDs. The same restriction doesn't seem to apply to applications built on those solid foundations. The last 15 years have shown that the best innovations in application-oriented technologies happen when useful technology-building tools are widely distributed among the general population, whether or not they are technically qualified. There is an incredible amount of ingenuity out there that goes far beyond what might obtain within the walls of formally designated research institutions. Society benefits greatly from being seeded with an array of Tools for Innovation: give 'em the Tools to Create and then leave them alone.
Tuesday, June 30, 2009
When the machine is connected to an expanded online catalog of titles later this year, Morrow said, the bookstore will be able to offer customers an “ATM for books’’ that will provide access to millions of works. “The idea is that soon we’ll be able to print out any book that’s ever been printed,’’ he said. “That could really change people’s image of the small bookstore.’’
...
In its first year, Northshire’s book machine printed dozens of original books by customers, including memoirs, autobiographies, poetry collections, and cookbooks, usually producing from 30 to 50 copies of each. The bookstore also published a young adult novel written by a local 12-year-old and a previously out-of-print guide to Manchester. Self-publishers pay a $49 setup fee and a per-page rate that ranges from 5 to 9 cents, depending on the length. Northshire provides an a la carte menu of editorial and design services from a network of providers. Copy editing costs 1 cent per word; book design services, $40 an hour.
...
Rodefeld, a former graphic designer who works at a tiny desk next to the Espresso machine, produces up to 35 books a day. “It’s exciting to see an author’s face when I hand them the first book off the press,’’ she said. “To see the dream, the fantasy, become a reality - that really tickles me. I get to be Santa Claus all the time here.’’
...
The numbers at Northshire Bookstore, Morrow said, are “on the cusp’’ of working out. The big payoff will come, he said, when the Espresso machine is seamlessly connected to the entire universe of books, allowing the store to fulfill any request in minutes.
Great story! Probably apocryphal, but the kind of story I want to believe because I love the man so much.
Monday, June 29, 2009
I gotta stop this, else I'll end up posting every dang video of Feynman here. What an incredible teacher! He breathed such life and color and intensity into every little idea about the universe. He must have received a standing ovation at the end of each of his lectures.
Sunday, June 28, 2009
With a nod to Daniel Pink, where I found this video. Very, very neat -- especially the leftover box at the end. Just when you thought no more innovation was possible in this most minimal of designs, along comes a redesign that requires no additional material or even a significant change in the manufacturing process. And yet it avoids using additional material to wrap any leftovers. The leftover box takes up little room in a refrigerator. This is pure genius.
Posted by Murli at 6/28/2009 09:15:00 PM
To be honest, I never was one of your fans. And yes, over the last decade or two, whatever fame and adulation you may have garnered was overshadowed -- with considerable support from a rapacious media -- by your eccentricities, of which you had more than your fair share.
Friday, June 26, 2009
Every now and then a monumental event occurs, or a seemingly innocuous innovation enters human society, and dramatically alters its configuration, power structure, processes, communication, and a whole lot else. It's often hard to tell, until many years have passed, just how dramatic the impact was, but we live in a time when many such events occur.
The impact of the world wide web has not only been well-researched, it has been experienced all over the world. The web caused a very rapid, nearly discontinuous change in the way the peoples of the world generate, exchange, and absorb information. Many predictions about the world made before 1990 are practically worthless -- or the change happened much, much earlier than anyone may have anticipated.
The digitization of music followed by the infrastructure to stream it over the web electronically without the need for permanent recording media like tapes, vinyl records or CDs has rendered the music publishing industry in its current form virtually superfluous. And the same impact is beginning to be felt in the book publishing business -- ironically, after, rather than before the music business, although text and graphics were available on the net much before music was.
And now, the Apple iPhone. According to one report, within a week of the introduction of the iPhone 3GS, which is capable of recording video, there has been an incredible 400% surge in YouTube video uploads. Why? The iPhone makes it trivial not only to record, but also to edit and directly upload video from the phone, eliminating the intermediate steps of transferring video to a computer, editing it there, and uploading it.
Again and again we see that one of the most common ways in which innovations transform existing structures and processes is by eliminating intermediaries or intermediate steps. Telephones, email, personal vehicles, personal computers, TV ...
Forecasting is a chancy game in these times; most forecasters end up looking foolish, eventually. Who knew how popular Michael Jackson was? His untimely death is almost bringing down the internet, with people communicating their grief, sharing stories and songs, and celebrating his life.
Posted by Murli at 6/26/2009 11:39:00 PM
Thursday, June 25, 2009
You'd think success, even middle age, or Total Market Dominance -- or something -- would transform Microsoft. No such luck. The company that ripped off DOS (CP/M), Windows (Mac), Powerpoint (Persuasion), Internet Explorer (Mosaic), NT (VMS), and an almost endless list of technologies now rips off a small travel site called Kayak via its 'new/improved' Bing search engine. Take a look at the picture above and decide for yourself. And read the story too.
More than two decades ago when I was in graduate school, pursuing a PhD, I nervously carried a draft proposal of a topic that really excited me to a faculty member whom I looked upon as a potential dissertation advisor. A European with a reasonable command of English, she was reputed to be sharp, cold, curt, and fastidious. She was all that and more. She spent no more than about ten minutes with me (she was extremely organized), and during that span she browsed through my apology for a topic analysis, marked it all over in red ink, and left deep, long gashes in it. With each stroke of her pen, my enthusiasm dropped several feet, and by the time I left it must have gone right through the ground and emerged from the other side of the earth. I don't recollect anything she scribbled on the paper other than the following words: "you are scatterbrained".
These sudden insights, they found, are the culmination of an intense and complex series of brain states that require more neural resources than methodical reasoning. People who solve problems through insight generate different patterns of brain waves than those who solve problems analytically. "Your brain is really working quite hard before this moment of insight," says psychologist Mark Wheeler at the University of Pittsburgh. "There is a lot going on behind the scenes."
In fact, our brain may be most actively engaged when our mind is wandering and we've actually lost track of our thoughts, a new brain-scanning study suggests. "Solving a problem with insight is fundamentally different from solving a problem analytically," Dr. Kounios says. "There really are different brain mechanisms involved."
By most measures, we spend about a third of our time daydreaming, yet our brain is unusually active during these seemingly idle moments. Left to its own devices, our brain activates several areas associated with complex problem solving, which researchers had previously assumed were dormant during daydreams. Moreover, it appears to be the only time these areas work in unison. "People assumed that when your mind wandered it was empty," says cognitive neuroscientist Kalina Christoff at the University of British Columbia in Vancouver, who reported the findings last month in the Proceedings of the National Academy of Sciences. As measured by brain activity, however, "mind wandering is a much more active state than we ever imagined, much more active than during reasoning with a complex problem."
Apparently, people -- if you can call those over 35 that -- don't want to eat pizza anymore. At least, not the kind of junk served at Pizza Hut. This has the corporation's mandarins worried. They sat around a swank table and exclaimed, 'Holy crap, they don't want to eat junk anymore?! I wonder why?!' So they went and asked them (the 35-and-up geezers). And they said (and I quote):
one of the big things that would reignite their passion with the category is to have a pizza made with multigrain crust and an all natural tomato sauce...
... that ties in nicely with (today's) texting generation.
Wednesday, June 24, 2009
Twenty-five years ago, before we had Azim Premji and Narayanamurthy to inspire us; before Abdul Kalam fired our imaginations and became a household name; there was Sam Pitroda to provide leadership in advanced technologies to an emergent India. Pitroda grew up in a humble Gujarati family in Orissa and, after moving to the US, made it big as a telecom entrepreneur, eventually becoming a US citizen. He was invited by the then newly anointed Prime Minister Rajiv Gandhi to help navigate India into the 21st century on a technology platform. Pitroda was reputed to have turned in his US citizenship in order to serve as an advisor to Rajiv Gandhi.
Who would you vote to lead the corporations of the present into the future: suits who network with their peers at a country club, nursing a glass of Scotch, or folks in jeans (or even a suit, maybe?) connecting across the globe through social media (Facebook, Twitter, blogs, wikis, etc.)? Okay, my bias is showing here, but so what? And granted, it takes a whole lot more than being social media savvy to run a company -- today. Jonathan Schwartz, CEO of Sun Microsystems, has had a blog for years, dang it, and yet couldn't do better than sell out to Oracle. Then again, maybe he was so savvy, he sold out, and that was the best thing anyone could have done in the prevailing circumstances.
- Only two CEOs have Twitter accounts.
- 13 CEOs have LinkedIn profiles, and of those only three have more than 10 connections.
- 81% of CEOs don't have a personal Facebook page.
- Three quarters of the CEOs have some kind of Wikipedia entry, but nearly a third of those have limited or outdated information.
- Not one Fortune 100 CEO has a blog.
Wikipedia had the highest level of engagement among the Fortune 100 CEOs, yet 28% of those entries had incorrect titles, missing information or lacked sources.
Reports coming in from across the web suggest that Palm's Pre smartphone is an outstanding device, and in many ways superior (slider keyboard, wireless charging, camera flash, background applications, web integration, removable battery, and a slick, gesture-based user interface) to Apple's iPhone. While Apple's newly introduced iPhone 3GS ups the ante, there are sufficient reasons for people to stick with the iPhone. Back in the day, Apple's Macintosh, introduced in 1984, attracted a loyal following, but the bulk of the market was owned by the far inferior IBM/Microsoft PC, mainly for two reasons: the lower price, and the availability of many applications.
If I were running Apple, I would milk the Macintosh for all it's worth -- and get busy on the next great thing. The PC wars are over. Done. Microsoft won a long time ago.
"The products suck! There's no sex in them anymore!" -- On Gil Amelio's lackluster reign, in BusinessWeek, July 1997
Tuesday, June 23, 2009
Life itself is a chancy thing, but like love, the pursuit of innovation is among the most fickle and heartbreaking ways to spend one's time on earth. And just as nobody ought to fall in love for any reason but to love, it is wise never to take up innovating unless the process of innovation is itself intrinsically appealing and satisfying to the heart. Whatever comes out of the process ought to be treated as a bonus.
America had been inspired by President Kennedy's wish, announced in 1961, of "achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to Earth." After his assassination in 1963, the idea became a homage to him, a way of showing the world what the United States would have achieved had he lived. Within days of the Apollo 11 astronauts' safe return to Earth, someone put a message on Kennedy's grave: "Mr President, the Eagle has landed." Job done, in other words.
It was also an extraordinarily expensive project, it should be noted. The entire Apollo programme cost $24bn in 1960s money - around $1 trillion in today's - and for several years was swallowing up almost 5 per cent of the US federal budget. In addition, there was a considerable emotional cost to the missions, a point stressed by Christopher Riley, co-producer of the 2007 documentary In the Shadow of the Moon. "A great many Americans suffered premature heart attacks and strokes from their efforts in making the Apollo project succeed. More than 400,000 workers were employed by private contractors to build capsules, rocket engines, space suits, and computers for Apollo and the vast majority worked flat out, over weekends and holidays, much of the time for free, for several years to make sure the programme succeeded." For example, at the Grumman factory in New Jersey, where the lunar module was built, staff would clock off at 5pm, leave by the front door, walk round to the back and work for free until midnight. Similarly, employees at the International Latex Corporation - which made the suits worn by the Apollo astronauts - worked with equally obsessive intensity. In a recent documentary, the company's senior seamstress, Eleanor Foraker, recalled working 80-hour weeks without days off or holidays for three continuous years, suffering two nervous breakdowns in the process. "I would leave the plant at five o'clock in the morning and be back by seven. But it was worth it, it really was."
In the end, the real problem for Nasa is that it did the hardest thing first. Kennedy's pledge to fly to the moon within a decade was made when its astronauts had clocked up exactly 20 minutes' experience of manned spaceflight. "We wondered what the heck he was talking about," recalls Nasa flight director Gene Kranz. To get there before the Russians the agency was obliged to design craft that were highly specific to the task. Hence the Saturn V, the Apollo capsule and the lunar module. Unfortunately, these vehicles were fairly useless at anything else in space - such as building a space station - and Nasa, having nearly broken the bank with Apollo, had to start again on budgets that dwindled dramatically as the decades passed.
The conclusion is therefore inescapable. Kennedy's great vision and Armstrong's lunar footsteps killed off deep-space manned missions for 40 years - and probably for many decades to come. As DeGroot says: "Hubris took America to the moon, a barren, soulless place where humans do not belong and cannot flourish. If the voyage has had any positive benefit at all, it has reminded us that everything good resides here on Earth."
Blu-ray will doubtless continue to grow in popularity as more of us buy large HD-capable flat screen televisions. In the same vein, it'll continue to make inroads in the computing and video gaming markets. But it's a case of too little, too late, as long-term trends point to a slower uptake than DVDs ever had. When we can simply download a good-enough copy of a movie from iTunes and save it to a USB drive or mobile device for viewing pretty much anywhere, why would we even bother with a power-hungry, noisy, expensive and frankly inconvenient disc in the first place? By the time most consumers have asked themselves this question, the answer will already be in: optical discs are a fading technology, and investing in them now could be a shorter-term move than you might have initially anticipated.