The Taming of Data – On the Value Train from Insights to Knowledge to Wisdom to perhaps Happiness?

I recently attended the IBM #SmarterAnalytics Summit in New York City that focused on #Analytics and #Optimization. The sessions, and the client panel in particular, were superb and enlightening. Beyond the typical discussions on technology, the IBM client panel repeatedly emphasized that organizational and cultural changes are critical to properly implement and integrate #Analytics and #Optimization as a core business process.

This re-sparked a train of thoughts in my mind, one I even got to test a bit later at the evening reception. On my train ride back home, these thoughts on how to tame this avalanche of data for the benefit of mankind (corporations included) continued to escalate. I figured I should transcribe them quickly before the train crashed and burst into some forgotten cloud! For this, my Cloud Mobile (iPhone) with speech recognition software (Dragon) came to my rescue.

On Data, Words and Deeds, and Ephemeral Social Media

It’s well recognized by IT industry experts that data by itself has little value. It’s what you do with it that generates the value. It reminds me of Lech Walesa’s quote “The supply of words in the world market is plentiful but the demand is falling. Let deeds follow words now.” Or simply put, in an anonymous quote, “talk is cheap because supply exceeds demand”.

I am not suggesting that we clamp down on the supply of words. That would be tantamount to curtailing free speech. But we must take a thoughtful approach and critically examine the hype around #Bigdata – primarily perpetuated by the IT industry, of which, as an analyst, I too am guilty.

Also guilty – contributing to the excess supply of data – is the recent explosive growth of "unstructured" data: images, video, voice, and more. Probably because many believe that "a picture is worth a thousand words". And a video even more! Every time I hear this oft-used cliché, I think REALLY? WHY? Why are we creating all these quantities of image/video data and spending our precious resources (our time) doing so? More importantly, why are we so enamored with transmitting this data to others?

Yes, new social media and the underlying technologies give every individual enormous capability for creative expression and have even contributed to the overthrow of oppressive regimes, e.g. the Arab Spring. But aren't we collectively trampling on another form of creative expression – the thoughtful, reflective kind – by drowning each other in all this data? Or aren't we being distracted by all these images that, like fast food, fill us up to sated exhaustion but have very little nutritional value?

But Some Words do Matter. Some Words are better than Exabytes of Pictures (or Words) and they Persist!

Here are some poignant examples. This is what the great contemporary Swedish poet, Tomas Tranströmer (translated by Robin Robertson), wrote about words:

FROM MARCH 1979

Sick of those who come with words, words but no language,

I make my way to the snow-covered island.

Wilderness has no words. The unwritten pages

Stretch out in all directions.

I come across this line of deer-slots in the snow: a language,

language without words.

And the great 20th century Mexican poet, Octavio Paz (translated by J. M. Cohen), wrote:

CERTAINTY

If the white light of this lamp

is real, and real

the hand that writes,

are the eyes real

that look at what I write?

 

One word follows another.

What I saw vanishes.

I know that I am alive,

and living between parentheses.

Distinctive Numbers – God’s Equation Then and Now – Hey It’s All Just Zeros and Ones

Just like profound and wise words, there are some distinctive numbers (data) that also matter: zero, the imaginary number i, and those irrationals π and e. And then there's Euler's God's Equation of centuries back: e^(i2π) = 1. Thus, the "simplest" and most fundamental of all numbers (Numero Uno) is incredibly intricate, made up of irrational, transcendental constants that extend to infinity. Now, the more contemporary version of God's Equation (circa 2007) is the fourth album by the Norwegian progressive metal band Pagan's Mind – and it contains video clips! But hey, today it's all just digital data, which is, at the end of the day, zeros and ones – the two most fundamental numbers. So why are we all making such a hoopla?
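Since we are talking numbers, it takes only a couple of lines of Python (a quick aside of mine, not anything from the summit) to check that identity numerically:

```python
import cmath

# Euler's "God's Equation": e^(i*2*pi) equals 1 exactly in theory;
# floating-point arithmetic gets us there up to tiny rounding error.
z = cmath.exp(1j * 2 * cmath.pi)
print(abs(z - 1) < 1e-12)  # True
```

All those zeros and ones in our data centers can at least agree on this one.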

Because we must traverse that Divine Manifold from Data to Information to Insights to Knowledge to Wisdom and perhaps Happiness

Data is plentiful (all the data generated today can't even be stored!), and left untamed it is bound to be catastrophic. So we (corporations included) must harmonize all our assets and capabilities (people, process, data, technology, and culture) to navigate this data onslaught and traverse the Value Train with the help of yet another God's Equation: one that transforms Data to Information to Insights to Knowledge to Wisdom. One noteworthy recent technology asset for this journey to wisdom could be #IBMWatson.

That great wise soul, Mahatma Gandhi, once said: “Happiness is when what you think, what you say, and what you do are in harmony.” So the Happy (and Wise) enterprises of the future in our data-driven world will be those that can act and culturally transform themselves through change and a complete re-think of strategy – just like the IBM client panel repeatedly emphasized – those were divine words! And they matter! Act on them! The customer is always right!

Tracking an IT Analyst’s Journey on the Cloud Mobile: My Musings after Attending IBM Pulse 2012.

The one key takeaway for me from the conference was IBM's message on the confluence of secure private clouds and an increasingly mobile world – employees, clients, partners, and other stakeholders. This is the Cloud Mobile (think Snow Mobile). It has to be safe, secure, and comfortable, yet must perform, scale, and deliver a high quality of service. IBM unveiled a set of solutions to support this vision, and you can get all the details from IBM Pulse 2012.

Very early in the morning on March 7 at the hotel, after a shower, I turned on the TV news. I heard that Apple planned to announce the iPad3 later that day, probably at the same prices as the iPad2, whose price was expected to drop. This irked me, as I had bought an iPad2 just a few weeks before. But I quickly got over it, as I had already obtained significant business value from my iPad2 investment. This iPad2 and my iPhone4 are my most valuable mobile devices – a fact that would be further reinforced as that day's events unfolded. While checking out of my hotel to return home to Connecticut, a train of thoughts on my Cloud Mobile began to evolve in my mind that I want to share with you.

The Cloud Mobile has enhanced many professional pursuits in differing ways

Gone are the days when Mathematics was largely a solo sport and the primary tools were just paper, pencil, extraordinary rigor, and amazing individual imagination. In recent years, with the advent of the Internet and an unusual level of collaboration among mathematicians, it has increasingly become a team sport with just as much rigor and an even greater and more amazing group wisdom and imagination.

This has greatly advanced innovation and discovery in Mathematics, even in such arcane areas as Number Theory that were once the province of individual brilliance. In fact, the famous Fermat's Last Theorem was finally proved by Andrew Wiles in 1995 after centuries of sustained collaboration and inquiry. And while computers played little part in Wiles's proof, they were used extensively to resolve the Four Color Theorem in 1976. With cloud computing, this level of collaboration will only increase. But then, in some sense, Mathematicians have always been on the cloud!

Painting/art is still largely a solitary activity with less technology impact. Yes there are new artistic areas impacted by technology and graphics but the most creative artists and painters still rely only on their traditional tools – canvas, paint, rigorous techniques, and an amazing imagination. And yes artists and painters are notorious for their nomadic and mobile lifestyles. They too have always been on the cloud!

While writers continue to work primarily solo, there is an increasing trend for them to work in groups, particularly when creating complex technical or non-fiction content. Markup capabilities in modern word processors and in Google Docs further facilitate these group efforts, particularly in the cloud!

But while painters and writers both possess amazing creative capabilities, they differ in at least one way – can you imagine a painter handing over his/her brush so a collaborator could mark up an evolving work of art?!

Technologists and Engineers tend to innovate better in groups and through collaboration. In fact, the industrial revolution and the subsequent rise of today’s large corporations depended heavily on this group collaboration. Today, this collaboration extends to other stakeholders including suppliers, customers, investors, and business partners. And Engineering Clouds are being adopted in the Manufacturing industry to improve productivity in design and development. So they too are getting on the cloud!

My current profession – IT Analyst – is a blend of several of the above professions. IT Analysts must possess the analytical rigor of the mathematician, the conceptual creativity of an artist, the storytelling capabilities of a writer, and the knowledge of the technologist. Add to these the experience of a business professional – marketing, sales, management, etc. So naturally, IT Analysts should also benefit from the cloud!

How on March 7, the Cloud Mobile helped this IT Analyst

Just as I was finishing up my breakfast at the MGM Grand with some of my colleagues, I saw a missed call from one of my key clients responsible for Business Analytics. So as I took the cab to the airport, I called him back. He wanted an estimate of the size and growth of data in the financial services industry, particularly financial markets. He had a good estimate of the total size and growth across all industries and had tried some internal sources, but did not have an estimate for his particular area. I told him that I was on the road and would try to do some investigation and get back to him the following day.

Now my firm, unlike some other major analyst firms, does not routinely provide these types of market estimates. There are other firms that specialize in these studies and make these reports available to their clients. I do not have access to those reports. But often there is enough information on the web that one can piece together an informed estimate to such questions. So after checking in at the airport, I pulled out my iPad2 – fortunately the airport had free Wi-Fi access – and began searching the web. After about half an hour, I had some relevant pieces of helpful information but was still nowhere near an estimate. I was a little disappointed and was almost ready to give up temporarily.

But then suddenly, I remembered that I had downloaded a very comprehensive Big Data report written by a major Global Think Tank in 2011. This report was on my secure private storage cloud. I had always planned to read it but never got the time to do so. So with my iPad2, I connected to my secure private cloud (protected by two levels of security) and pulled the report into iBooks on my iPad2. Then I boarded the plane. And as the plane soared up above the clouds and the flight attendant announced that we could turn on electronic devices, I opened up the iPad2 and began reading the report.

After about three hours, I found the missing pieces of information scattered through that report. Not only did I find the missing links to provide my client with an informed estimate, but I also read through this comprehensive Big Data report, completely oblivious to the uncomfortable middle seat I was sitting in. Now that's a ton of business value made possible by the cloud!

The plane landed at Charlotte, NC where I had to transfer to White Plains, NY. I was keen on composing the email to my Business Analytics client summarizing how I had arrived at the informed estimate and the rationale. But I got hungry. So I had a nice hot and spicy Mexican meal at Tequileria at the Charlotte airport. After the meal, I boarded my next flight and slept through the short flight to my destination. The next morning, I sent the email to my client with the informed estimate and rationale.

The Advantages of a Private Cloud Mobile

IBM's notion of providing clients capabilities to build and deploy secure private clouds and connect as needed to hybrid clouds should help security-conscious (and very cost-conscious) enterprise executives make the transition to the cloud to support their very talented mobile workforce. Beyond the obvious transactional mobile use cases (e.g. procurement, sales force automation, invoicing) that improve operational efficiency, the Cloud Mobile can – as depicted in my own personal use case – facilitate a level of analysis, collaboration, productivity, and innovation that can be a source of significant competitive advantage for enterprises while nurturing their talented mobile knowledge workers.

There's a reason why I did not put that Big Data report on a public cloud, e.g. Apple's free iCloud service. These reports and other similar content are my sources of competitive advantage and differentiation. I like to keep them secure, private, and protected through several layers of security, yet accessible on demand. Also, through this private cloud, I can regulate access for my many collaborators in the cloud!

Back to the Cloud Mobile. The Music and the Pulsating Moves at Pulse 2012 and More.

Maroon 5's concert at Pulse 2012 indeed made the Cloud Mobile move like Jagger! This built on some amazing, fluid, cloud-like dance moves we witnessed earlier in the day by a group called iLuminate. Musicians and performers too are on the cloud! Performers collaborate and rehearse constantly. They are constantly on the road and mobile. And while there are individual superstars, there is nothing like listening to a well-coordinated, talented group, either at a concert or in your Cloud Mobile (Automobile).

This weekend the weather was perfect in Connecticut. I had the great joy and pleasure of taking my younger twin son to his choir performance and concert in my Cloud Mobile (Car). Then we all witnessed the lovely performance of his dedicated choir culminating after weeks (and weekends) of group rehearsals and practice. It put this parent on the Cloud! And that feeling even the best IT Analyst can’t analyze! It can only be experienced – in the cloud!

Politics 2012

That the Internet has changed everything is a truism. Some stunning – and sometimes alarming – takeaways from "Politics, Tech & Decision 2012," the most recent Gotham Media Ventures panel discussion, bring this home with a bang.

  • Online presence is everything. The first hire in today's political campaigns is the website team, not the campaign manager.

  • Privacy has become a quaint illusion. Database techniques now make it possible to serve and measure advertising and other messages to increasingly specific market segments. The next step in targeting will be knowing what you think right now. Target is already trying to identify and sell to pregnant women specifically in their third trimester. Credit card companies can predict divorces two years before they happen with 95 percent accuracy, simply by seeing changes in consumer spending patterns.

  • "We all live in a yellow submarine." Personalization and digital targeting surround each of us in a membrane of filters, so there is less and less discursive conversation. We each tend to be always talking with like-minded individuals and to be less and less exposed to opposing views.

  • More money buys more influence than ever before. Super PACs in 2012 are dirtier and more powerful than ever before. They are funded by huge donations – $20 million – from really big donors. They often support shadow campaigns of tweets and viral videos that are user generated but paid for by PACs.

  • Polls no longer speak truth. Online polls are now skewed because of technological flaws; they're all misleading. We each live in our own echo chamber (see the yellow submarine bullet above). Automated polling gets under 10 percent response; it's illegal to call cell phones for research, and 35 percent of people have no land lines.

  • One thing hasn't changed. TV is still the most effective election tool for all demographics, while social media are the most persuasive tools on issues.

  • Speed counts. The Internet has increased volatility to an unbelievable extent. Being nimble has become more important than planning. How can you anticipate a potential crisis? How can you respond in Internet time?

  • Two kinds of power. It's become a bimodal world, where you either have to have the big donors locked up or have huge broad-based online support from celebrities and/or grassroots. That at least provides a ray of hope for the masses!

To give credit where credit is due: Richard Hofstetter, partner, Frankfurt Kurnit Klein & Selz, was the moderator. Panelists were: Michael Bassik, managing director and US digital practice chair, Burson-Marsteller; Taegan Goddard, founder and publisher, Political Wire; Eason Jordan, former chief news executive, CNN, and founder and CEO, Poll Position; and Eli Pariser, board president and former executive director, MoveOn.org. Frankfurt Kurnit hosted the event.

The Strategic Importance of Technical Computing Software

Beyond sticking processors together, Sticky Technical Computing and Cloud Software can help organizations unlock greater business value through automated integration of Technical Computing assets – Systems and Applications Software.

Most mornings when I am in Connecticut and the weather is tolerable, I go for a jog or walk in my neighborhood park in the Connecticut Sticks. One recent crisp, sunny fall morning, as I was making my usual rounds, I got an email alert indicating that IBM had closed its acquisition of Algorithmics – a financial risk analysis software company – which would be integrated into IBM's Business Analytics division. This, along with the then-recent announcement of IBM's planned acquisition of Platform Computing (www.ibm.com/deepcomputing), sparked a train of thoughts that stuck with me through the holidays and through my travels of over 15,000 miles to India and back in January 2012. Today is February 25, 2012 – another fine day in Connecticut – and I just want to go for a gentle three-mile jog, but I made a personal commitment to finish and post this blog today. So here it is, before I go away to the Sticks!

Those of you who have followed High Performance Computing (HPC) and Technical Computing through the past few decades as I have may appreciate these ruminations more. But these are not solely HPC thoughts. They are, I believe, indicators of where value is migrating throughout the IT industry and how solution providers must position themselves to maximize their value capture.

Summarizing Personal Observations on Technical Computing Trends in the last Three Decades – The Applications View

My first exposure to HPC/Technical Computing was as a Mechanical Engineering senior at the Indian Institute of Technology, Madras in 1980-1981. All students were required to do a project in their last two semesters, either individually or in groups. Projects required either significant laboratory work (usually done in groups) or significant theoretical/computational analysis (usually done individually). Never interested in laboratory work, I decided to work on a computational analysis project in alternate energy. Those were the days of the second major oil crisis, so this was a hot topic!

Simply put, the project was to model the flame propagation in a hybrid-fuel (ethanol and gasoline) internal combustion engine using a simple one-dimensional (radial) finite-difference model, study this chemically reacting flow over a range of concentration ratios (ethanol/gasoline : air), and determine the optimal concentration ratio to maximize engine efficiency. Using the computed flame velocity, it was possible to algebraically predict the engine efficiency under typical operating conditions. We used an IBM 370 system, and in those days (1980-1981) these simulations ran overnight in batch mode with punched cards as input. It took an entire semester (about four months) to finish this highly manual computing task, for several reasons:

  1. First, I could run only one job in the night; physically going to the computer center, punching the data deck and the associated job control statements and then looking at the printed output the following morning to see if the job ran to completion. This took many attempts as inadvertent input errors could not be detected till the next morning.
  2. Secondly, the computing resources and performance were severely limited. When the job actually began running, often it would not run to completion in the first attempt and would be held in quiescent (wait) mode as the system was processing other higher priority work. When the computing resources became available again, the quiescent job would be processed and this would continue multiple times until the simulation terminated normally. This back and forth often took several days.
  3. Then, we had to verify that the results made engineering sense. This was again a very cumbersome process as visualization tools were still in their infancy and so the entire process of interpreting the results was very manual and time consuming.
  4. Finally, to determine the optimal concentration ratio to maximize engine efficiency, it was necessary to repeat steps 1-3 over a range of concentration ratios.
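For readers curious what even a toy version of that 1981 computation looks like today, here is a minimal sketch in Python – emphatically not the original model, just a generic one-dimensional reaction-diffusion equation (u_t = D·u_xx + k·u·(1−u)) whose traveling front crudely mimics a propagating flame; all parameter values are illustrative:

```python
import numpy as np

def flame_front_speed(D=1.0, k=1.0, n=400, L=40.0, dt=0.004, steps=2000):
    """Estimate the front speed of a 1-D reaction-diffusion 'flame'."""
    dx = L / n
    u = np.zeros(n)
    u[: n // 10] = 1.0          # ignited region near one end
    front0 = np.argmax(u < 0.5) * dx
    for _ in range(steps):
        # explicit finite-difference update (stable since D*dt/dx^2 < 0.5)
        lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
        lap[0] = lap[-1] = 0.0  # crude fixed-value boundaries
        u += dt * (D * lap + k * u * (1 - u))
    front1 = np.argmax(u < 0.5) * dx
    return (front1 - front0) / (steps * dt)  # average front speed

print(flame_front_speed())
```

An overnight punched-card batch job in 1981; a fraction of a second on a laptop now. The contrast is the whole point of what follows.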

By that time the semester had ended, and I was ready to call it quits. But I still had to type the project report. That was another ordeal. We didn't have sophisticated word processors that could type Greek letters and equations, create tables, and embed graphs and figures. So this took more time and consumed about half my summer vacation before I graduated in time to receive my Bachelor's degree. But in retrospect, this drudgery was well worth it.

It makes me constantly appreciate the significant strides made by the IT industry as a whole – dramatically improving the productivity of engineers, scientists, analysts, and other professionals. And innovations in software, particularly applications and middleware have had the most profound impact.

So where are we today in 2012? The fundamental equations of fluid dynamics are still the same, but the applications benefiting industry and mankind are wide and diverse (for those of you who are mathematically inclined, please see this excellent one-hour video on the nature and value of computational fluid dynamics (CFD): https://www.youtube.com/watch?v=LSxqpaCCPvY).

We also have yet another oil crisis looming ominously. There’s still an urgent business and societal need to explore the viability and efficiency of alternate fuels like ethanol. It’s still a fertile area for R&D. And much of this R&D entails solving the equations of multi-component chemically reacting, transient three dimensional fluid flows in complex geometries. This may sound insurmountably complex computationally.

But in reality, there have been many technical advances that have helped reduce some of the complexity.

  1. The continued exponential improvement in computer performance – at least a billion fold or more today over 1981 levels – enables timely calculation.
  2. Many computational fluid dynamics (CFD) techniques are sufficiently mature, and in fact there are commercial applications such as ANSYS FLUENT that do an excellent job of modeling the complex physics and come with very sophisticated pre- and post-processing capabilities to improve the engineer's productivity.
  3. These CFD applications can leverage today’s prevalent Technical Computing hardware architecture – clustered multicore systems – and scale very well.
  4. Finally, the emergence of centralized cloud computing (https://www.cabotpartners.com/Downloads/HPC_Cloud_Engineering_June_2011.pdf ) can dramatically improve the economics of computation and reduce entry barriers for small and medium businesses.

One Key Technical Computing Challenge on the Horizon

Today my undergraduate (1981) chemically reacting flow problem can be fully automated and run on a laptop in minutes – perhaps even on an iPad. And this would produce a "good" concentration ratio. But a one-dimensional model may not truly reflect actual operating conditions. For that we would need today's three-dimensional transient CFD capabilities, which could run economically on a standard Technical Computing cluster and produce a more "realistic" result. With integrated pre- and post-processing, engineers' productivity would be substantially enhanced. This is possible today.
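As a hedged illustration of what "fully automated" means in practice – with a stand-in efficiency curve, since I no longer have the 1981 model – sweeping the concentration ratios and picking the optimum is now a few lines of Python:

```python
from concurrent.futures import ThreadPoolExecutor

def engine_efficiency(ratio):
    # Placeholder for a real combustion simulation: a smooth curve with a
    # single interior optimum, purely for illustration.
    return 1.0 - (ratio - 0.22) ** 2

def sweep(ratios):
    # Run the "simulations" concurrently and return the best (ratio, eff).
    with ThreadPoolExecutor() as pool:
        effs = list(pool.map(engine_efficiency, ratios))
    return max(zip(ratios, effs), key=lambda pair: pair[1])

ratios = [i / 100 for i in range(5, 41)]  # sweep ratios 0.05 .. 0.40
best_ratio, best_eff = sweep(ratios)
print(best_ratio)  # 0.22 for this placeholder curve
```

The semester-long ritual of steps 1-4 above collapses into one loop; only the fidelity of the underlying simulation still costs real compute.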

But what if a company wants to run several of these simulations concurrently and share the results with a broader engineering team – a team that may wish to couple this engine operating information to the drivetrain through the crankshaft using kinematics, and then, using computational structural dynamics and exterior vehicle aerodynamics, model the automobile (chassis, body, engine, etc.) as a complete system to predict its behavior under typical operating conditions? Let's further assume that crashworthiness and occupant safety analyses are also required.

This system-wide engineering analysis is typically a collaborative and iterative process and requires several applications that must be integrated in a workflow, producing and sharing data. Much of this today is manual, and it is one of today's major Technical Computing challenges – not just in the manufacturing industry but across most industries that use Technical Computing and leverage data. This is where middleware will provide the "glue" – and believe me, it will stick if it works! And work it will! The Technical Computing provider ecosystem will head in this direction.
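To make the workflow idea concrete, here is a tiny sketch using Python's standard-library graphlib; the step names are purely illustrative of the scenario above, and real middleware would launch actual jobs and move data between them rather than print:

```python
from graphlib import TopologicalSorter

# Each step lists the steps whose outputs it consumes.
workflow = {
    "combustion_cfd":      set(),
    "kinematics":          {"combustion_cfd"},
    "structural_dynamics": {"kinematics"},
    "vehicle_aero":        set(),
    "full_system_model":   {"structural_dynamics", "vehicle_aero"},
    "crashworthiness":     {"full_system_model"},
}

# The middleware "glue": run steps in an order that respects dependencies.
for step in TopologicalSorter(workflow).static_order():
    print("running", step)
```

Independent branches (here, the aerodynamics model) could even run in parallel; the dependency graph is exactly what makes that automation safe.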

Circling Back to IBM’s Acquisition of Algorithmics and Platform Computing

With the recent Algorithmics and Platform acquisitions, IBM has recognized the strategic importance of software and middleware to increase revenues and margins in Technical Computing – not just for IBM but also for value-added resellers worldwide, who could develop higher-margin implementation and customization services based on these strategic software assets. IBM and its application software partners can give these channels a significant competitive advantage to expand reach and penetration with small and medium businesses that are increasingly using Technical Computing. When coupled with other middleware such as GPFS and Tivoli Storage Manager, and with the anticipated growth of private clouds for Technical Computing, expect IBM's ecosystem to enhance its value capture. And expect clients to achieve faster time to value!

No Apology for High Performance Computing (HPC)

A few months back, at one of my regular monthly CTO club gatherings here in Connecticut, an articulate speaker discussed the top three IT trends that are fundamentally poised to transform businesses and society at large. The speaker eloquently discussed the following three trends:

  • Big Data and Analytics
  • Cloud Computing
  • Mobile Computing

I do agree that these are indeed the top three IT trends in the near future – each at differing stages in adoption, maturity and growth. But these are not just independent trends. In fact, they are overlapping reinforcing trends in today’s interconnected world.

However, while discussing big data and analytics, the speaker made it a point to exclude HPC as an exotic niche area – largely of interest to, and by implication restricted to, scientists, engineers, and other "non-mainstream" analysts who demand "thousands" of processors for their esoteric work in such diverse fields as proteomics, weather/climate prediction, and other scientific endeavors. This immediately made me raise my hand and object to such ill-advised pigeonholing of HPC practitioners – architects, designers, software engineers, mathematicians, scientists, and engineers.

I am guilty of being an HPC bigot. I think these practitioners are some of the most pioneering and innovative folks in the global IT community. I indicated to the speaker (and the audience) that because of the pioneering and path-breaking pursuits of the HPC community, who are constantly pushing the envelope in IT, the IT community at large has benefited from such now-mainstream mega IT innovations as open source, cluster/grid computing, and in fact even the Internet. Many of today's mainstream Internet technologies emanated from CERN and NCSA – both organizations that continue to push the envelope in HPC today. Even modern-day data centers with large clusters and farms of x86 and other industry-standard processors owe their meteoric rise to the tireless efforts of HPC practitioners. As early adopters, these practitioners painstakingly devoted their collective energies to building, deploying, and using the early HPC cluster and parallel systems – servers, storage, networks, the software stack, and applications – constantly improving their reliability and ease of use. In fact, such systems power most of today's businesses and organizations globally, whether in the cloud or in some secret basement. Big data analytics, cloud computing, and even mobile/social computing (Facebook and Twitter have gigantic data centers) are trends that sit on the shoulders of the HPC community!

By IT standards, the HPC community is relatively small – some 15,000 practitioners attend the annual Supercomputing event. This year's event is in Seattle and starts on November 12. But HPC practitioners have very broad shoulders, very keen and incisive minds, and a passionate demeanor not unlike that of pure mathematicians. Godfrey H. Hardy – a famous 20th century British mathematician – wrote A Mathematician's Apology, defending the arcane and esoteric art and science of pure mathematics. But we as HPC practitioners need no such Apology! We refuse to be castigated as irrelevant to IT and big IT trends. We are proud to practice our art, science, and engineering. And we have the grit, muscle, and determination to continue to ride in front of big IT trends!

I have rambled enough! I wanted to get this “off my chest” over these last few months. But with my dawn-to-dusk day job of thinking, analyzing, writing and creating content on big IT trends for my clients; and with my family and personal commitments, I have had little time till this afternoon. So I decided to blog before getting bogged down with yet another commitment. It’s therapeutic for me to blog about the importance and relevance of HPC for mainstream IT. I know I can write a tome on this subject. But lest my tome goes with me unwritten in a tomb, an unapologetic blog will do for now.

By the way, G. H. Hardy's Apology – an all-time favorite tome of mine – is not really an apology. It's a passionate story explaining what pure mathematicians do and why they do it. We need to write such a tome for HPC, to educate the broader and vaster IT community. But for now this unapologetic blog will do. Enjoy. It's dusk in Connecticut. The pen must come off the paper. Or should I say, the finger off the keyboard? Adios.

The US Healthcare System – One Big Tax on the Economy – Beyond Costs and Operational Efficiencies – Innovation is Critical – Technology Helps.

It’s well known that US Healthcare costs are skyrocketing. Estimates range from 15%-20% of US GDP – a greater share than in any other developed nation in the world. Left unchecked, this will be a big burden that today largely falls on US employers and businesses. And these businesses have to pass these costs on to their customers, making them cost-uncompetitive in an increasingly globalized world. I found the following recent articles very illuminating in describing the challenges in US Healthcare and the implications of globalization:

  1. The Big Idea: How to Solve the Cost Crisis in Health Care, Robert S. Kaplan and Michael E. Porter, Harvard Business Review, September 2011.
  2. The Risks and Rewards of Health-Care Reform, Peter Orszag, Foreign Affairs, July/August 2011.
  3. How America Can Compete – Globalization and Unemployment, Michael Spence, Foreign Affairs, July/August 2011.

But the big question is what each of us can do individually, collectively in an organization, and in our ecosystem across organizations – nationally and globally.

On a recent weekend – October 1 – I attended a talk by Dr. Atul Gawande sponsored by the New Yorker magazine and IBM. This was preceded by an exclusive breakfast meeting with Atul. I was fortunate to be invited, and I thank IBM for a very gracious invitation to this event hosted by Dr. Paul Grundy of IBM, who is also President of the Patient-Centered Primary Care Collaborative. At breakfast, I also got to spend some quality time with the publisher of the New Yorker and other doctors (all medical – not the Poor Hungry Doctor (Ph.D.) kind, like yours truly!) who are all facing these challenges of the US Healthcare system.

During the breakfast event and the subsequent talk, much of the emphasis was on reducing costs and improving operational efficiencies in the US Healthcare system. Dr. Gawande was very effective in conveying his path breaking ideas on how checklists and coaching can greatly improve a surgeon’s performance and result in far better patient outcomes.

Dr. Gawande started with the premise that we all reach a plateau at one point or the other in our lives and careers. And as we push ourselves to become better at what we do, the marginal benefits of our efforts seem to be all for naught. So what can we do? How can we increase our operational efficiency? His recipe marries continuous learning with coaching.

I encourage everyone interested in this subject to read his recent article in the New Yorker and also his book on checklists. His book also covers professions beyond surgery, including architects, athletes, etc. It stresses that in addition to continuous learning throughout one's life, a coach is an essential partner for continuous self-improvement in any profession, particularly those that are knowledge-based. This clearly includes mine – Information Technology (IT) analyst and entrepreneur.

As IT professionals, our lives have become complex – that is today's harsh reality. We all have to do more with less, as we all have less time and leaner budgets. And yet we also have to do more with more, as we are drowning in data, interruptions, and regulations. This more-or-less is driving us nuts. Everything is escalating at a frantic rate and pace while margins continue to dwindle. We are constantly challenged to improve operationally, every day, in what we do.

Part of the problem is IT itself. IT in some ways has caused this problem and I think IT is also part of the solution. I constantly ask myself these reflective questions: Is speed a virtue? Is Big Data really that useful? Is constant improvement always better? I think the answer to these questions is the proverbial “Yes and No” which drives me further nuts. Being an engineer, I like the determinism of a precise unambiguous answer. I like the precision of checklists but clearly also appreciate the value of coaching! So it is Yes and No for me now on these philosophical issues.

While IT has made a very positive impact on improving the operational efficiencies of healthcare, process innovations are also required (some IT-enabled, others requiring business incentives). In fact, in response to a question from the audience, Atul gave an example of how a surgeon in his hospital was able to take a standard but lower-cost surgical gauze and cut it so that it would be a better fit for purpose – tuned to task – rather than using the more expensive pre-cut gauze. This adjusted process was then adopted by several surgeons in the hospital, resulting in substantial savings in operational costs while improving patient outcomes. This was clearly a business process innovation!

But IT must itself be tuned to task and fit for purpose. In short, IT must become Smarter. It's what IBM calls Smarter Computing. With Watson and other related Smart IBM efforts, and by fostering collaboration across the healthcare ecosystem (Dr. Grundy's efforts), IBM is providing the incentive and impetus needed to help address the challenges with the US Healthcare system. With events such as the one on Oct/1, IBM and its partners are providing the mentoring and coaching for everyone touched by the healthcare system!

It Takes Two To Tango!

A classic mistake for start-ups is to ask one name to fly solo. It’s usually the product name. Since entrepreneurs have to be obsessed with their product to start a business, that’s probably to be expected. But then how can you be sure you’re talking about the company when it has the same name as the product? Which brand promise do you want to imply or express?

Apple Computer owes its name to a small apple farm where Steve Jobs spent time each year with friends in the mid-70s. And it was the name of both the product and the company until the Lisa, Mac and other new products came along. That’s the typical pattern for start-ups. It was also a long time ago in terms of today’s marketplace.

Today you need all the brand power you can get to claim and hold a place in customer minds, and you need this the minute you start marketing.

Products can be treated as brands – given proprietary names and a brand platform as the backbone of marketing communications efforts. Or they can be given descriptive names and associated with a brand. But they’re missing out if they are denied the halo effect of a corporate brand.

The corporate brand is the face of a business strategy – what the company wants to be known for. In time, it becomes the organizing principle that simplifies the complexity of multiple products and the umbrella that facilitates new product acceptance.

The cost need be no greater for two than for one if you do it right – and you’ll build a far stronger foundation for ongoing sales and profits with both.

Software Everyware – Hungry or Happy?

I recently attended the IBM Innovate conference as an IBM guest analyst. At the outset, I must thank IBM – especially their outstanding Software IT analyst team – for being an excellent host and providing us a forum to get a lot of valuable technical and business information on the IBM Rational portfolio of solutions targeted at developers and the IT community. The overarching theme was Software Everyware.

As I returned to Connecticut, this theme got me thinking. Those of you who know me know that I am a foodie. Those who know me better also know that not only do I relish good food, but I also like to sample and customize it and get it "at a whim" when traveling or on the road. My close friends and family often think that my whole world revolves around food! And travel itineraries are purposefully built for this!

So during one of the round tables, when an IBM executive drew an analogy between integrated software solutions and being Hungry and wanting food, it resonated very well with me. The scenario he painted went as follows: Imagine you are driving and you want to stop to get some food. Using Yelp on your smartphone, you get a list of nearby restaurants serving close to what you are yearning for, and you read the reviews. Then using Groupon, you check if there are any coupons that could be used, and using the GPS you arrive at the restaurant and have a good meal that makes you Happy and sated! This is great but can be better!

Now, taking this further, he said, imagine if all of this was integrated, and all you do is press a Hungry button – similar to the Staples Easy button. And voila, you get all these processes and applications integrated and you arrive at the restaurant with less manual action on your part. Perhaps with a meal ordering system integrated, you could start munching your delicious meal as soon as you arrive at the restaurant – a classic Just in Time (JIT) system! This could make you even more Happy!
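The Hungry button scenario above is really just a pipeline of services behind one action. Here is a toy sketch of that pipeline; every function and data value in it is a hypothetical stand-in – not a real Yelp, Groupon, or GPS API – invented only to show how the separate steps could be chained into one integrated press.

```python
# Toy sketch of the "Hungry button": each function is a hypothetical
# stand-in for a real service, with canned data for illustration.

def find_restaurants(craving):
    """Stand-in for a restaurant search service (Yelp-like)."""
    return [{"name": "Curry Corner", "serves": "indian", "rating": 4.5},
            {"name": "Pasta Place", "serves": "italian", "rating": 4.0}]

def best_coupon(restaurant):
    """Stand-in for a coupon lookup (Groupon-like); invented discounts."""
    coupons = {"Curry Corner": 0.20}  # 20% off -- assumed for illustration
    return coupons.get(restaurant["name"], 0.0)

def navigate_to(restaurant):
    """Stand-in for GPS routing."""
    return f"Routing to {restaurant['name']}..."

def preorder(restaurant, craving):
    """Stand-in for a just-in-time meal-ordering system."""
    return f"Pre-ordered {craving} food at {restaurant['name']}"

def hungry_button(craving):
    """One press chains search, coupons, routing, and pre-ordering."""
    matches = [r for r in find_restaurants(craving) if r["serves"] == craving]
    # Prefer the best coupon, then the best rating -- an assumed policy.
    choice = max(matches, key=lambda r: (best_coupon(r), r["rating"]))
    return navigate_to(choice), preorder(choice, craving)

route, order = hungry_button("indian")
print(route)  # Routing to Curry Corner...
print(order)  # Pre-ordered indian food at Curry Corner
```

The point of the sketch is the shape, not the stubs: once each step exposes an API, the "button" is a short orchestration function rather than four manual apps.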

So Software Everyware allows you to collaborate, integrate, and innovate! And yes, become Happy faster while minimizing manual effort!

But innovation is not just about technology or products but rather about the careful design and optimization of the business with people, processes, policies, and partners with purpose, passion, persistence, and perspiration! This is what I witnessed at the IBM Rational Innovate conference. Beyond Software Everyware, it was also Happy Everyware!

I went in Hungry to learn and returned Happy! And I didn’t press any buttons! My world has become better!

OPEN VIRTUALIZATION ecosystem continues to gather momentum – New KVM Alliance

Today’s enterprise data center crisis is largely caused by the sprawl of under-utilized x86 systems, ever escalating electricity costs, and increasing staffing costs. Using virtualization to centralize and consolidate IT workloads, many organizations have significantly reduced their IT capital costs, reduced operational expenses, improved IT infrastructure availability, and achieved better performance and utilization.
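The consolidation arithmetic behind those savings can be sketched in a few lines. Every figure below – utilization levels, wattage, electricity price – is an assumption chosen purely for illustration, not data from any real data center.

```python
import math

# Back-of-envelope consolidation math; all inputs are illustrative
# assumptions, not measurements from any specific environment.

def consolidation_savings(physical_servers, avg_utilization,
                          target_utilization, watts_per_server,
                          cost_per_kwh, hours_per_year=8760):
    """Estimate hosts retired and annual power dollars saved when the same
    total work is packed onto fewer, better-utilized virtualized hosts."""
    needed = math.ceil(physical_servers * avg_utilization / target_utilization)
    retired = physical_servers - needed
    kwh_saved = retired * watts_per_server * hours_per_year / 1000
    return needed, retired, kwh_saved * cost_per_kwh

# Assumed: 100 hosts at 10% average utilization consolidated to 60%,
# 400 W per host, $0.10 per kWh.
needed, retired, dollars = consolidation_savings(100, 0.10, 0.60, 400, 0.10)
print(f"{needed} hosts needed, {retired} retired, ${dollars:,.0f}/yr saved")
# 17 hosts needed, 83 retired, $29,083/yr saved
```

Even this crude model (it ignores hypervisor overhead, peak headroom, and licensing) shows why under-utilized x86 sprawl is such an inviting target for virtualization.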

Last month, Red Hat, Inc. (NYSE: RHT) and IBM (NYSE: IBM) announced that they are working together to make products and solutions based on KVM (Kernel-based Virtual Machine) technology the OPEN VIRTUALIZATION choice for the enterprise. Several successful deployment examples – e.g., the IBM Research Computing Cloud (RC2) and BNP Paribas – were highlighted.

Later in the month, BMC Software, Eucalyptus Systems, HP, IBM, Intel, Red Hat, Inc., and SUSE announced the formation of the Open Virtualization Alliance, a consortium intended to accelerate the adoption of open virtualization technologies including KVM.

The benefits of KVM (https://www.cabotpartners.com/Downloads/IBM_Linux_KVM_Paper.pdf) include outstanding performance on industry standard benchmarks, excellent security and reliability, powerful memory management, and very broad support for hardware devices including storage. Further, since KVM is part of Linux, clients can benefit from the numerous advantages of Linux including lower TCO, more versatility, and support for the widest range of architectures and hardware devices. Moreover, Linux performs, scales, is modular and energy-efficient, is easy to manage, and supports an extensive and growing ecosystem of ISV applications.

While we believe that OPEN VIRTUALIZATION holds great promise to address the crises in today’s data centers and is a key enabling technology for clients contemplating a transition to cloud computing, its success – and that of the alliance members – will depend largely on how this new alliance grows and how alliance members can:
 

  • Build a more complete and robust IT ecosystem that includes Independent Software Vendors (ISVs), Systems Integrators (SIs), and other data center/cloud solution providers.
  • Provide a MEASURED MIGRATION path (http://cabotdatacenters.wordpress.com/2011/05/27/measured-migration-is-smart-for-the-datacenter-and-clouds/) to existing clients who have substantial IT investments in proprietary virtualization technologies.
  • Deliver differentiated offerings (systems, complementary software, and services) that best address the growing client workloads and data center crises now and in the future.

More proof points of this alliance's future momentum would be the participation of a major ISV or SI as a key driving member and/or the adoption of OPEN VIRTUALIZATION in mission-critical environments at banks or large-scale government installations that demand bullet-proof security and reliability. We think this will happen – sooner rather than later – as the KVM alliance momentum builds!

In the end, the Open Source (VIRTUALIZATION included) movement has always been about providing clients the flexibility of choice, growth, and customization by avoiding the proprietary traps of vendor lock in; yet maintaining the most stringent enterprise grade requirements of security, reliability, and quality of service!

MEASURED MIGRATION is Smart for the Datacenter and Clouds

Imagine the solar energy needed to convert the earth’s water mass to clouds! Likewise, with legacy IT investments estimated to be in the trillions of dollars in an interconnected global IT environment, the sheer effort to migrate even a modest fraction of these environments to the cloud can be colossal.

Yet in the past few years the predominant debate in enterprise IT seems to be around the rate and pace of the transition to the cloud starting with the need to make the datacenter smart and agile.

While we believe that cloud computing will dramatically impact the way IT services are consumed and delivered in the future, we also believe that this transition must be thoughtful and measured. Companies must have a MEASURED MIGRATION trajectory that is staged to minimize risks and maximize returns. They must take great care to examine which business and IT processes can be migrated with an eye to optimizing their business metrics without assuming needless risk.
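One way to make "staged to minimize risks and maximize returns" concrete is a simple triage that ranks each workload's expected cloud benefit against its migration risk. The workloads, scores, and risk weight below are invented purely for illustration – a sketch of the idea, not a real assessment methodology.

```python
# Toy MEASURED MIGRATION triage: rank workloads by benefit vs. risk.
# The workload list, 0-10 scores, and risk weight are invented examples.

workloads = [
    # (name, cloud_benefit, migration_risk)
    ("dev/test environments",   9, 2),
    ("batch analytics",         7, 4),
    ("customer-facing web",     6, 6),
    ("core transaction system", 4, 9),
]

def migration_priority(benefit, risk, risk_weight=1.5):
    """Higher score = earlier migration stage. Risk is penalized more
    heavily than benefit is rewarded -- the 'measured' bias."""
    return benefit - risk_weight * risk

staged = sorted(workloads,
                key=lambda w: migration_priority(w[1], w[2]), reverse=True)
for stage, (name, b, r) in enumerate(staged, 1):
    print(f"Stage {stage}: {name} (score {migration_priority(b, r):+.1f})")
# Stage 1: dev/test environments (score +6.0)
# ...
# Stage 4: core transaction system (score -9.5)
```

The weighting encodes the core of the argument: a measured migration deliberately front-loads low-risk, high-benefit workloads and defers mission-critical systems until the earlier stages have retired the risk.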

Numerous surveys suggest that cloud computing will be an over-$100 billion opportunity by 2015 and that a large fraction of IT solutions will be delivered over the cloud in the next few years. While we could debate the precise estimates, we believe that:
 

  • The market opportunity is large with growth rates much faster than the overall IT industry,
  • Private and hybrid clouds will become the dominant cloud delivery models as enterprise workloads begin to leverage the promise of clouds and security concerns persist with public clouds,
  • Before making substantial new cloud investments, businesses will carefully examine the business case that will be primarily driven by their current and future workload needs, and lastly,
  • Customers will rely on cloud providers who have the deepest insights into their workloads and can deliver a broad portfolio of cloud software, services, and systems optimized to these workloads with a MEASURED MIGRATION strategy.

The winners, we believe, will be those IT solution providers who will not only have promising technology solutions in such cloud enabling technologies as virtualization, scalable file systems, end-to-end systems management, etc., but also have a strategic vision and execution path that facilitates this through MEASURED MIGRATION.

IBM's Systems Software Division, part of the Systems and Technology Group (STG), is one such large solution provider, with an impressive array of over 16 cloud-enabling IaaS technologies ranging from virtualization and systems management to scalable file systems, high availability, and disaster recovery. But more importantly, in recent briefings, we were impressed by the strategy and vision articulated by the leaders of these IBM units. These leaders consistently emphasized the need to build end-to-end solutions and staged engagement methodologies that not only deliver best-in-class technology solutions but also help clients with MEASURED MIGRATION as they modernize their datacenters or embark on the transition to cloud computing.

We heard these senior executives articulate the need for IT environments to be “tuned to task”, “optimized through comprehensive systems management”, “staged migration to private clouds and then seamlessly integrated with public clouds to manage spiky workloads”, etc. All this is critical for MEASURED MIGRATION.

In fact, at a later briefing, we learned that IBM has a Migration Services group that has grown by almost a factor of 10 in just the past 6 years or so. This “Migration Factory” is, we believe, a major driver of IBM’s substantial recent revenue growth across STG, especially in the Linux/Unix market.

With thousands of successful migrations and competitive wins, we believe IBM and its ecosystem partners have the resources and track record to scale this MEASURED MIGRATION to the cloud. It’s a strategy that will ultimately – over the next decade or more – transition a significant part of today’s IT investments on our earth to the clouds!