Simply Pure and Purely Simple – Systems, Stacks and Clouds

I recently attended the IBM IMPACT conference. During the keynote, an IBM executive remarked (and I am paraphrasing), "It is very hard to make technology simple." This is a very profound observation.

This morning (May/2), comfortably perched in my aisle seat on the plane as I return to the wonderful state of Connecticut, several impressions from IMPACT start to swirl in my mind. So I thought I would clarify, or purify, these impressions and articulate them as simply as I can, so that I can purge them from my conscious mind-cache yet make them persist in words through this blog post.

Tomorrow, at the crack of dawn in Connecticut, I want to start with a clear and pure mind and continue with my day job (IT Analyst and Consultant) of gathering, analyzing, and articulating additional ideas and thoughts to benefit the paying clients who sponsor my consulting projects. I must do that to make a living and ensure the continuing well-being of my children. This is pure and simple.

PureSystems – Patterning and Partnering

IMPACT was quite packed – over 8,500 attendees. The solution center opened on the evening of April 29, and I was fortunate to leisurely examine the demos that piqued my interest. I spent a significant amount of time examining the various PureSystems exhibits. In particular, I was very impressed seeing the internals of an operating PureFlex system with its dense packaging – servers, storage, and networking.

But I was even more impressed when I spent a significant amount of time with Manhattan Associates – a PureSystems Application Provider partner that focuses on delivering supply chain solutions for both planning and execution. Their very smart and enthusiastic lead technical expert told me that Manhattan Associates has over 15 software products that they have been able to integrate with the PureApplication System using a combination of IBM patterns and Manhattan patterns. This simplicity, he said, was a pure delight to clients in retail, logistics, and other areas where optimizing the supply chain is critical to enhancing operational efficiency.

To date, IBM has over 100 such partnerships and expects to deliver hundreds more similar PurePatterns across various industries with even more partnerships. It will be interesting to quantify the collective ROI that clients receive both in ease of deployment and in ongoing operations from this PureEcosystem.

It is well known that labor costs in IT operations are one of the fastest-growing components of the total cost of ownership (TCO). What PureSystems, along with the growing portfolio of PurePatterns, does is tackle this head-on to make IT simpler to use – similar to the value proposition for cloud computing, which has been front and center in the minds of IT organizations worldwide. This brings me to my second set of takeaways from IMPACT.

Cloud OpenStack – Molecules Matter

During the IBM Analyst deep-dive sessions, I got the opportunity to understand the scale and focus of IBM’s Cloud OpenStack initiative. One primary motivation behind this open source initiative is to simplify and standardize Cloud Use Cases and Workloads by building a technology stack using open source and standards to instantiate these use cases. IBM used a very nice chemistry analogy to explain this: system components and their functions are like elements in the periodic table while real-life workloads are like molecules that provide higher level business function and are composed of several pre-wired components (elements).

Then we witnessed a feature-rich demo that depicted a fairly comprehensive cloud business use case. This consortium plans to produce many more such cloud business use cases, and members plan to contribute code and other resources to the initiative. In the next few months, IBM plans to work with other consortium members on governance and process-related matters, in addition to growing this ecosystem to include more end users and application providers.

All this will make these molecules matter even more in the industry. It will further that magical chemistry that continues to fuel the Open Source movement that was born at the dawn of the Internet era.  Are we poised to witness another spike in the IT industry with the impending confluence of open source solutions in Big Data, Analytics, and Cloud? The mathematics and technologies for this exist. The bigger question is do we have the knowledge and human capital and the collective wherewithal to leverage all of this? I think so. The first movers have already spoken. The rest will follow. It’s that simple!

OpenStack the PureSystem

IBM was asked several times during the analyst session whether there was a plan to extend the Cloud OpenStack to PureSystems. While no formal commitments or announcements were made, I felt that, from a business strategy perspective, this is a purely simple matter. It will only enhance the PureEcosystem. This chemical bonding would deliver a macro-molecule that could help enterprises deploy clouds in much the same way Enterprise Linux did almost a decade ago. It should also further the collective benefits of Open Source for one and all – pure and simple.

The plane has landed at Westchester Airport. For me, the simple act of comfortably flying will now be replaced by the laborious act of navigating the traffic on the busy highway (I-684) that goes north to Danbury, Connecticut. Too bad automobiles are not yet self-navigating and autonomic. But this is bound to happen as cars increasingly become computers over the next decade or so and acquire all the intelligent capabilities of sensing and responding in real time. But this labor of driving through traffic will be well worth it, as I will experience the pure joy of being back home. After all, as a wise man once said, "Home is where the heart is!" That, my dear friend, is pure and simple!

To extend what the IBM executive said at IMPACT – let us make technology homely! We as IT analysts do, in our own small way, contribute to this goal by trying to communicate as best we can the value of technology in simple terms.


The Pure Thing

Yesterday, I watched from the comfort of my home office IBM's PureSystems "Unveiling of a New Computing Era" announcement in New York City. After the initial background business discussion by Mr. Steve Mills – Sr. Vice President and Group Executive for IBM Systems and Software – the curtain was lifted by Mr. Rod Adkins, IBM Senior Vice President, Systems and Technology Group. At that very instant, with a wide grin, Steve made a comment that I am paraphrasing: "Unlike software, with systems you can actually see the real thing." When the curtain was lifted, there stood that gleaming blue PureFlex system. This sparked a train of thought that gelled this morning under this spring's cool Connecticut sun during my customary jog in the park.

What is a Thing?

During my spare time, and sometimes to get a really good night's sleep, I read. One book that does an admirable and efficient job of "accelerating the time to deep slumber" is What is a Thing? by Martin Heidegger, one of the greatest 20th-century philosophers. I've had the pleasure of making it about a third of the way through.

But this morning, reflecting on Steve's comment, I thought: What is IT (information technology) today? Why is the word "Pure" so relevant? What does this all mean? This created an energizing stream of shedding "thought-vortices" whose trajectories, like their fluid-mechanics counterparts, are difficult to model and predict, much less tame and transcribe. But here is where Martin and some reflection come to the rescue.

Material and Abstract Things

You see – a system like the PureSystem is something that you can see, touch, and feel. It's a material thing. Data (even BigData) you cannot see, touch, or feel. But you can visualize data through software. Software is not a material thing (actually, like data, it is an abstract thing), but it makes a material impact, especially when grounded and optimized on a material thing like a system and then used to solve a business or scientific problem.

Likewise, mathematics (one of the most abstract things) has its profound impact when its purest form is applied to solve the challenging problems of the day especially those that have a material impact, for example, the impact of shedding vortices on aircraft operating performance or the calculation of the best available airfare between two cities. All this is of course done in software that runs on a system.

The Everything and Nothing Route to Profound and Pure Insights

But perhaps the most abstract thing is philosophy, and the philosophers who pursue these thought-vortices may take "this thing (whatever that thing is)" and argue that it's nothing, just as their philosopher colleagues could argue that it is everything. That is the duality of zero and infinity. For instance, the great Greek philosopher Aristotle was once asked how he was able to come up with such profound insights. It is rumored that Aristotle answered that he sat in a room and opened all the windows, and an avalanche of thoughts came flying into his head, which he then curated to come up with profound "pure" insights. That's BigInsights from BigData.

Contrast this with Buddha who sat in total isolation and completely “emptied” his mind of all thoughts and meditated and came up with yet another set of profound and “pure” insights. That’s starting with a “clean slate”.

The surest (and perhaps purest) thing that I think I know is that I am. But do I really know that? That’s an entirely new and different question for another day.

When Aristotle (West) Meets Buddha (East) in the Cloud

I was told during the IBM PureSystems announcements that IBM worked on this initiative over the last four years, taking input and lessons from thousands of client engagements around the world, and came up with this highly optimized, cloud- and analytics-ready family of systems and platforms. I was also told that the technical architects started with a "clean slate". That is like marrying East with West to get the best of both worlds, and this should be great for clients everywhere!

Now lest I get fired from my day job of doing IT analyst work, I must move on to my next “thing” which is finishing up the white paper that my employer wants me to write! That is very material my dear friend! To my children – who read my blogs – note your survival and well-being depend on my finishing this next thing!


The Taming of Data – On the Value Train from Insights to Knowledge to Wisdom to perhaps Happiness?

I recently attended the IBM #SmarterAnalytics Summit in New York City, which focused on #Analytics and #Optimization. The sessions, and the client panel in particular, were superb and enlightening. Beyond the typical discussions on technology, the IBM client panel repeatedly emphasized that organizational and cultural changes were critical to properly implement and integrate #Analytics and #Optimization as core business processes.

This re-sparked a train of thought in my mind. I even got to test these thoughts a bit later at the evening reception. On my train ride back home, this train of thought on how to tame the avalanche of data for mankind's (including corporations') benefit continued to escalate. I thought I should transcribe it quickly before it crashes and bursts into some forgotten cloud! For this, my Cloud Mobile (iPhone) with speech recognition software (Dragon) came to my rescue.

On Data, Words and Deeds, and Ephemeral Social Media

It's well recognized by IT industry experts that data by itself has little value. It's what you do with it that generates the value. It reminds me of Lech Walesa's quote: "The supply of words in the world market is plentiful but the demand is falling. Let deeds follow words now." Or, simply put, in an anonymous quote, "talk is cheap because supply exceeds demand".

I am not suggesting that we clamp down on the supply of words. That would be tantamount to curtailing free speech. But we must take a thoughtful approach and critically examine the hype around #BigData – primarily perpetuated by the IT industry, of which, as an analyst, I too am guilty.

Also guilty – contributing to the excess supply of data – is the recent spate of growth in "unstructured" data: images, video, voice, and more. Probably because many believe that "a picture is worth a thousand words". And a video even more! Every time I hear this oft-used cliché, I think: REALLY? WHY? Why are we creating all these quantities of image and video data and spending our precious resources (our time) doing so? More importantly, why are we so enamored with transmitting this data to others?

Yes, new social media and the underlying technologies give every individual enormous capability for creative expression and even contribute to the overthrow of oppressive regimes, e.g., the Arab Spring. But aren't we collectively trampling on another form of creative expression – the thoughtful, reflective kind – by drowning each other in all this data? Or aren't we being distracted by all these images that, like fast food, fill us up to sated exhaustion but have very little nutritional value?

But Some Words do Matter. Some Words are better than Exabytes of Pictures (or Words) and they Persist!

Here are some poignant examples. This is what the great contemporary Scandinavian poet, Tomas Tranströmer (translated by Robin Robertson), wrote about words:

FROM MARCH 1979

Sick of those who come with words, words but no language,
I make my way to the snow-covered island.
Wilderness has no words. The unwritten pages
Stretch out in all directions.
I come across this line of deer-slots in the snow: a language,
language without words.

And the great 20th century Mexican poet, Octavio Paz (translated by J. M. Cohen), wrote:

CERTAINTY

If the white light of this lamp
is real, and real
the hand that writes,
are the eyes real
that look at what I write?

One word follows another.
What I saw vanishes.
I know that I am alive,
and living between parentheses.

Distinctive Numbers – God’s Equation Then and Now – Hey It’s All Just Zeros and Ones

Just like profound and wise words, there are some distinctive numbers (data) that also matter: zero, the imaginary number i, and those irrationals π and e. And then there's Euler's God's Equation of centuries back: e^(i2π) = 1. Thus, the "simplest" and most fundamental of all numbers (Numero Uno) is incredibly complex, made up of irrational, transcendental constants whose expansions extend to infinity. Now, the more contemporary version of God's Equation (circa 2007) is the fourth album by the Norwegian progressive metal band Pagan's Mind – and it contains video clips! But hey, today it's all just digital data, which is, at the end of the day, zeros and ones – the two most fundamental numbers. So why are we all making such a hoopla?
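For the skeptical reader, that equation is easy to check numerically. A throwaway Python sketch (nothing IBM-specific, just the standard library) confirms that e raised to the power i·2π lands back on 1:

```python
import cmath

# Euler's relation tying together the "distinctive numbers" e, i, and pi:
# e^(i*2*pi) = cos(2*pi) + i*sin(2*pi) = 1
z = cmath.exp(2j * cmath.pi)
# z equals 1 up to floating-point rounding (a vanishingly small imaginary part)
```

So the most fundamental of numbers really is stitched together, digit by infinite digit, from those transcendental constants.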

Because we must traverse that Divine Manifold from Data to Information to Insights to Knowledge to Wisdom and perhaps Happiness

Data is plentiful (all the data generated today can't even be stored!) and, left untamed, is bound to be catastrophic. So we (corporations included) must harmonize all our assets and capabilities (people, process, data, technology, and culture) to navigate this data onslaught and traverse the Value Train with the help of yet another God's Equation: this new equation must transform Data to Information to Insights to Knowledge to Wisdom. One recent noteworthy technology asset for this journey to wisdom could be #IBMWatson.

That great wise soul, Mahatma Gandhi, once said: "Happiness is when what you think, what you say, and what you do are in harmony." So the happy (and wise) enterprises of the future in our data-driven world will be those that can act and culturally transform themselves through change and a complete rethink of strategy – just as the IBM client panel repeatedly emphasized. Those were divine words! And they matter! Act on them! The customer is always right!


Tracking an IT Analyst’s Journey on the Cloud Mobile: My Musings after Attending IBM Pulse 2012.

The key takeaway for me from the conference was IBM's message on the confluence of private, secure clouds to support an increasingly mobile world – employees, clients, partners, and other stakeholders. This is the Cloud Mobile (think Snow Mobile). It has to be safe, secure, and comfortable, yet it must perform, scale, and deliver a high quality of service. IBM unveiled a set of solutions to support this vision, and you can get all the details from IBM Pulse 2012.

Very early on the morning of March/7 at the hotel, after a shower, I turned on the TV news. I heard that Apple planned to announce the iPad3 later in the day, probably at the same pricing as the previous iPad2, and that the price of the iPad2 was expected to be reduced. This irked me, as I had just bought an iPad2 a few weeks before. But I quickly got over that, as I had already obtained significant business value from my iPad2 investment. This iPad2 and my iPhone4 are my most valuable mobile devices. That fact would be further reinforced as the day's events unfolded. While checking out of my hotel to return home to Connecticut, a train of thought on my Cloud Mobile began to evolve in my mind that I want to share with you.

The Cloud Mobile has enhanced many professional pursuits in differing ways

Gone are the days when Mathematics was largely a solo sport and the primary tools were just paper, pencil, extraordinary rigor, and amazing individual imagination. In recent years, with the advent of the Internet and an unusual level of collaboration among mathematicians, it has increasingly become a team sport with just as much rigor and an even greater and more amazing group wisdom and imagination.

This has greatly advanced innovation and discovery in Mathematics, even in such arcane areas as Number Theory, once the province of individual brilliance. In fact, the famous Fermat's Last Theorem was finally proved by Andrew Wiles in 1995 after centuries of sustained inquiry. And, yes, computers were partly used as tools to arrive at this result, just as they were used heavily to resolve the Four Color Theorem in 1976. With cloud computing, this level of collaboration will only increase. But then, in some sense, mathematicians have always been on the cloud!

Painting/art is still largely a solitary activity with less technology impact. Yes there are new artistic areas impacted by technology and graphics but the most creative artists and painters still rely only on their traditional tools – canvas, paint, rigorous techniques, and an amazing imagination. And yes artists and painters are notorious for their nomadic and mobile lifestyles. They too have always been on the cloud!

While writers continue to work primarily solo, there is an increasing trend for them to work in groups, particularly when creating complex technical or non-fiction content. Markup capabilities in modern word processors and collaborative tools like Google Docs further facilitate these group efforts, particularly in the cloud!

But while painters and writers both possess amazing creative capabilities, they differ in at least one way – can you imagine a painter giving up his or her brush to a collaborator to mark up an evolving work of art?!

Technologists and Engineers tend to innovate better in groups and through collaboration. In fact, the industrial revolution and the subsequent rise of today’s large corporations depended heavily on this group collaboration. Today, this collaboration extends to other stakeholders including suppliers, customers, investors, and business partners. And Engineering Clouds are being adopted in the Manufacturing industry to improve productivity in design and development. So they too are getting on the cloud!   

My current profession – IT Analyst – is a blend of several of the above professions. IT Analysts must possess the analytical rigor of the mathematician, the conceptual creativity of an artist, the storytelling capabilities of a writer, and the knowledge of the technologist. Add to these the experience of a business professional – marketing, sales, management, etc. So naturally, IT Analysts should also benefit from the cloud!

How on March 7, the Cloud Mobile helped this IT Analyst

Just as I was finishing up my breakfast at the MGM Grand with some of my colleagues, I saw a missed call from one of my key clients, who is responsible for Business Analytics. So as I took the cab to the airport, I called him back. He wanted an estimate of the size and growth of data in the financial services industry, particularly financial markets. He had a good estimate of the total size and growth across all industries and had tried some internal sources, but did not have an estimate for his particular area. I told him that I was on the road and would try to do some investigating and get back to him the following day.

Now my firm, unlike some other major analyst firms, does not routinely provide these types of market estimates. Other firms specialize in these studies and make the reports available to their clients. I do not have access to those reports. But often there is a lot of information on the web that one can piece together to arrive at an informed estimate for such questions. So after checking in at the airport, I pulled out my iPad2 – fortunately the airport had free Wi-Fi access – and began searching the web. After about half an hour, I had some relevant pieces of helpful information but was still nowhere near an estimate. I was a little disappointed and was almost ready to give up, at least temporarily.

But then suddenly I remembered that I had downloaded a very comprehensive Big Data report written by a major global think tank in 2011. This report was on my secure private storage cloud. I had always planned to read it but never found the time to do so. So with my iPad2, I connected to my secure private cloud (protected by two levels of security) and pulled the report into my iPad2's iBooks format. Then I boarded the plane. And as the plane soared up above the clouds and the flight attendant announced that we could turn on electronic devices, I opened up the iPad2 and began reading the report.

After about three hours with that report, I found the missing pieces of information scattered in various places. Not only did I find the missing links to provide my client with an informed estimate, but I also read through this comprehensive Big Data report and was completely oblivious to the uncomfortable middle seat I was sitting in. Now that's a ton of business value made possible by the cloud!

The plane landed at Charlotte, NC where I had to transfer to White Plains, NY. I was keen on composing the email to my Business Analytics client summarizing how I had arrived at the informed estimate and the rationale. But I got hungry. So I had a nice hot and spicy Mexican meal at Tequileria at the Charlotte airport. After the meal, I boarded my next flight and slept through the short flight to my destination. The next morning, I sent the email to my client with the informed estimate and rationale.

The Advantages of a Private Cloud Mobile

IBM's notion of providing clients capabilities to build and deploy secure private clouds and connect as needed to hybrid clouds should help security-conscious (and very cost-conscious) enterprise executives make the transition to the cloud to support their very talented mobile workforce. Beyond the obvious transactional mobile use cases – e.g., procurement, sales force automation, invoicing – that improve operational efficiency, the Cloud Mobile can (as depicted in my own personal use case) facilitate a level of analysis, collaboration, productivity, and innovation that can be a source of significant competitive advantage for enterprises while nurturing their talented mobile knowledge workers.

There's a reason why I did not put that Big Data report on a public cloud, e.g., Apple's free iCloud service. These reports and other similar content are my sources of competitive advantage and differentiation. I like to keep them secure, private, and protected through several layers of security, yet accessible on demand. Also, through this private cloud, I can regulate access for my many collaborators in the cloud!

Back to the Cloud Mobile. The Music and the Pulsating Moves at Pulse 2012 and More.

Maroon 5's concert at Pulse 2012 indeed made the Cloud Mobile move like Jagger! This built on some amazing, fluid, cloud-like dance moves we witnessed earlier in the day by a group called iLuminate. Musicians and performers too are on the cloud! Performers collaborate and rehearse constantly. They are constantly on the road and mobile. And while there are individual superstars, there is nothing like listening to a well-coordinated, talented group, either at a concert or in your Cloud Mobile (Automobile).

This weekend the weather was perfect in Connecticut. I had the great joy and pleasure of taking my younger twin son to his choir performance and concert in my Cloud Mobile (Car). Then we all witnessed the lovely performance of his dedicated choir culminating after weeks (and weekends) of group rehearsals and practice. It put this parent on the Cloud! And that feeling even the best IT Analyst can’t analyze! It can only be experienced – in the cloud!


The Strategic Importance of Technical Computing Software

Beyond sticking processors together, Sticky Technical Computing and Cloud Software can help organizations unlock greater business value through automated integration of Technical Computing assets – Systems and Applications Software. 

Most mornings when I am in Connecticut and the weather is tolerable, I go for a jog or walk in my neighborhood park in the Connecticut Sticks. One recent crisp, sunny fall morning, as I was making my usual rounds, I got an email alert indicating that IBM had closed its acquisition of Algorithmics – a financial risk analysis software company – which would be integrated into IBM's Business Analytics division. This, along with the recent (at that time) announcement of IBM's planned acquisition of Platform Computing (www.ibm.com/deepcomputing), sparked a train of thought that stuck with me through the holidays and through my travels of over 15,000 miles to India and back in January 2012. Today is February 25, 2012 – another fine day in Connecticut. I just want to finish a gentle three-mile jog, but I made a personal commitment to finish and post this blog today. So here it is, before I go away to the Sticks!

Those of you who have followed High Performance Computing (HPC) and Technical Computing through the past few decades as I have may appreciate these ruminations more. But these are not solely HPC thoughts. They are, I believe, indicators of where value is migrating throughout the IT industry and how solution providers must position themselves to maximize their value capture.

Summarizing Personal Observations on Technical Computing Trends in the last Three Decades – The Applications View 

My first exposure to HPC/Technical Computing was as a Mechanical Engineering senior at the Indian Institute of Technology, Madras in 1980-1981. All students were required to do a project in their last two semesters. The project could be done individually or in groups, and required either significant laboratory work (usually done in groups) or significant theoretical/computational analysis (usually done individually). Never interested in laboratory work, I decided to work on a computational analysis project in alternate energy. Those were the days of the second major oil crisis, so this was a hot topic!

Simply put, the project was to model flame propagation in a hybrid-fuel (ethanol and gasoline) internal combustion engine using a simple one-dimensional (radial) finite-difference model, to study this chemically reacting flow over a range of concentration ratios (ethanol/gasoline : air) and determine the optimal concentration ratio that maximizes engine efficiency. From the computed flame velocity, it was possible to algebraically predict the engine efficiency under typical operating conditions. We used an IBM 370 system, and in those days (1980-1981) these simulations ran in batch mode at night, using punched cards as input. It took an entire semester (about four months) to finish this highly manual computing task, for several reasons:

  1. First, I could run only one job a night: physically going to the computer center, punching the data deck and the associated job control statements, and then looking at the printed output the following morning to see whether the job had run to completion. This took many attempts, as inadvertent input errors could not be detected until the next morning.
  2. Second, computing resources and performance were severely limited. When the job actually began running, it often would not run to completion in the first attempt and would be held in quiescent (wait) mode while the system processed other, higher-priority work. When computing resources became available again, the quiescent job would resume, and this would continue multiple times until the simulation terminated normally. This back and forth often took several days.
  3. Then, we had to verify that the results made engineering sense. This was again a very cumbersome process as visualization tools were still in their infancy and so the entire process of interpreting the results was very manual and time consuming.
  4. Finally, to determine the optimal concentration ratio that maximizes engine efficiency, it was necessary to repeat steps 1-3 over a range of concentration ratios.
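For the curious, the numerical heart of that project can be sketched today in a few lines of Python: a toy one-dimensional explicit finite-difference march of a reaction-diffusion model, with a crude flame-speed estimate from the moving front. Every parameter here is an illustrative placeholder, not real combustion data, and the model is far simpler than the original project's:

```python
import numpy as np

def flame_front_1d(alpha=1.0, rate=50.0, t_ign=0.5, nx=200, dx=0.01, nt=2000):
    """Explicit finite-difference march of a toy 1-D reaction-diffusion model.

    T is a normalized temperature (0 = unburned, 1 = fully burned); a simple
    reaction source switches on above the ignition threshold t_ign.
    All parameters are illustrative, not real combustion values.
    """
    dt = 0.25 * dx * dx / alpha          # respect the explicit stability limit
    T = np.zeros(nx)
    T[:5] = 1.0                          # ignite the left end of the domain
    positions = []
    for _ in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2   # central Laplacian
        source = rate * (1.0 - T) * (T > t_ign)              # ignition-style source
        T = np.clip(T + dt * (alpha * lap + source), 0.0, 1.0)
        positions.append(np.argmax(T < 0.5) * dx)            # crude front location
    # crude flame speed from the front's displacement over the march
    speed = (positions[-1] - positions[0]) / (dt * nt)
    return T, speed
```

On a modern laptop this runs in well under a second – a pleasant contrast with the overnight batch runs described above.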

By then the semester had ended, and I was ready to call it quits. But I still had to type the project report. That was another ordeal. We didn't have sophisticated word processors that could type Greek letters and equations, create tables, and embed graphs and figures. So this took more time and consumed about half my summer vacation before I graduated in time to receive my Bachelor's degree. But in retrospect, this drudgery was well worth it.

It makes me constantly appreciate the significant strides made by the IT industry as a whole – dramatically improving the productivity of engineers, scientists, analysts, and other professionals.  And innovations in software, particularly applications and middleware have had the most profound impact. 

So where are we today in 2012? The fundamental equations of fluid dynamics are still the same, but the applications benefiting industry and mankind are wide and diverse. (For those of you who are mathematically inclined, please see this excellent one-hour video on the nature and value of computational fluid dynamics (CFD): http://www.youtube.com/watch?v=LSxqpaCCPvY.)

We also have yet another oil crisis looming ominously. There’s still an urgent business and societal need to explore the viability and efficiency of alternate fuels like ethanol. It’s still a fertile area for R&D. And much of this R&D entails solving the equations of multi-component chemically reacting, transient three dimensional fluid flows in complex geometries. This may sound insurmountably complex computationally.

But in reality, there have been many technical advances that have helped reduce some of the complexity.

  1. The continued exponential improvement in computer performance – at least a billion fold or more today over 1981 levels – enables timely calculation.
  2. Many computational fluid dynamics (CFD) techniques are sufficiently mature and in fact there are commercial applications such as ANSYS FLUENT that do an excellent job of modeling the complex physics and come with very sophisticated pre and post processing capabilities to improve the engineer’s productivity.
  3. These CFD applications can leverage today’s prevalent Technical Computing hardware architecture – clustered multicore systems – and scale very well.
  4. Finally, the emergence of centralized cloud computing (http://www.cabotpartners.com/Downloads/HPC_Cloud_Engineering_June_2011.pdf) can dramatically improve the economics of computation and reduce entry barriers for small and medium businesses.
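To put the first point in perspective, a simple compounding calculation shows how quickly sustained doubling adds up. The doubling period below is an assumption for illustration (actual rates varied over the decades), but a doubling roughly every year over the 31 years from 1981 to 2012 does indeed compound past a billion-fold:

```python
def growth_factor(span_years: float, doubling_period_years: float) -> float:
    """Cumulative growth factor from repeated doubling over a time span."""
    return 2.0 ** (span_years / doubling_period_years)

# 1981 to 2012 is 31 years. Assuming performance doubles every ~12 months
# (a rough, illustrative rate), the cumulative factor is:
factor = growth_factor(31, 1.0)
print(f"{factor:.2e}")  # 2.15e+09 - over a billion-fold
```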

One Key Technical Computing Challenge on the Horizon

Today, my undergraduate (1981) chemically reacting flow problem can be fully automated and run on a laptop in minutes – perhaps even on an iPad. And this would produce a "good" concentration ratio. But a one-dimensional model may not truly reflect the actual operating conditions. For that we would need today's three-dimensional, transient CFD capabilities, which can run economically on a standard Technical Computing cluster and produce a more "realistic" result. With integrated pre- and post-processing, engineers' productivity would be substantially enhanced. This is possible today.
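To make the one-dimensional case concrete, here is a minimal sketch – not the original assignment; the rate constant, velocity, and reactor length are invented for illustration – of a 1-D plug-flow model with a single first-order reaction, integrated by explicit Euler. The analytic answer, C_out/C_in = exp(-kL/u), provides a check on the numerical result:

```python
import math

def plug_flow_ratio(k: float, u: float, length: float, n: int = 10000) -> float:
    """Outlet/inlet concentration ratio for dC/dx = -(k/u) * C, explicit Euler."""
    dx = length / n
    c = 1.0  # normalized inlet concentration
    for _ in range(n):
        c += -(k / u) * c * dx
    return c

# Hypothetical values: k = 5 1/s, u = 2 m/s, length = 1 m
ratio = plug_flow_ratio(k=5.0, u=2.0, length=1.0)
print(ratio, math.exp(-5.0 * 1.0 / 2.0))  # Euler result vs. analytic exp(-kL/u)
```

Even this toy version captures why the 1981 exercise took a semester by hand: thousands of small steps, each trivially simple, each demanding to be done exactly right.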

But what if a company wants to run several of these simulations concurrently and share the results with a broader engineering team – one that may wish to couple this engine operating information to the drivetrain through the crankshaft using kinematics, and then, using computational structural dynamics and exterior vehicle aerodynamics, model the automobile (chassis, body, engine, etc.) as a complete system to predict its behavior under typical operating conditions? Let's further assume that crashworthiness and occupant-safety analyses are also required.

This system-wide engineering analysis is typically a collaborative and iterative process and requires the use of several applications that must be integrated in a workflow, producing and sharing data. Much of this today is manual, and it is one of today's major Technical Computing challenges – not just in the manufacturing industry but across most industries that use Technical Computing and leverage data. This is where middleware will provide the "glue" – and believe me, it will stick if it works! And work it will! The Technical Computing provider ecosystem will head in this direction.
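As a toy sketch of what that middleware "glue" does – the stage names and the dict-passing interface here are invented for illustration, not any vendor's API – a workflow runner can order coupled analyses by their data dependencies and hand each stage the outputs of its upstream stages:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

def run_workflow(stages, deps):
    """Run each stage after its dependencies, passing upstream outputs along."""
    results = {}
    for name in TopologicalSorter(deps).static_order():
        inputs = {d: results[d] for d in deps.get(name, ())}
        results[name] = stages[name](inputs)
    return results

# Hypothetical coupled-simulation pipeline: engine CFD feeds the kinematics
# model, which feeds a full-vehicle structural/aerodynamic analysis.
stages = {
    "engine_cfd":  lambda _:   {"torque_curve": [100, 220, 310]},
    "kinematics":  lambda inp: {"loads": max(inp["engine_cfd"]["torque_curve"])},
    "vehicle_sim": lambda inp: {"peak_load": inp["kinematics"]["loads"]},
}
deps = {"kinematics": {"engine_cfd"}, "vehicle_sim": {"kinematics"}}
out = run_workflow(stages, deps)
print(out["vehicle_sim"])  # {'peak_load': 310}
```

The hard part in practice is not the ordering but everything this sketch omits: heterogeneous data formats, long-running jobs on shared clusters, provenance, and failure recovery – exactly where middleware earns its keep.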

Circling Back to IBM’s Acquisition of Algorithmics and Platform Computing

With the recent Algorithmics and Platform acquisitions, IBM has recognized the strategic importance of software and middleware to increasing revenues and margins in Technical Computing – not just for IBM but also for value-added resellers worldwide, who could develop higher-margin implementation and customization services based on these strategic software assets. IBM and its application software partners can give these channels a significant competitive advantage to expand reach and penetration with small and medium businesses that are increasingly using Technical Computing. When coupled with other middleware such as GPFS and Tivoli Storage Manager, and with the anticipated growth of private clouds for Technical Computing, expect IBM's ecosystem to enhance its value capture. And expect clients to achieve faster time to value!


No Apology for High Performance Computing (HPC)

A few months back, at one of my regular monthly CTO club gatherings here in Connecticut, an articulate speaker discussed the top three IT trends that are fundamentally poised to transform businesses and society at large. The speaker eloquently discussed the following three trends:

  • Big Data and Analytics
  • Cloud Computing
  • Mobile Computing

I do agree that these are indeed the top three IT trends for the near future – each at a differing stage of adoption, maturity, and growth. But these are not independent trends. In fact, they are overlapping, mutually reinforcing trends in today's interconnected world.

However, while discussing big data and analytics, the speaker made it a point to dismiss HPC as an exotic niche – largely of interest to, and by implication restricted to, scientists, engineers, and other "non-mainstream" analysts who demand "thousands" of processors for their esoteric work in such diverse fields as proteomics, weather/climate prediction, and other scientific endeavors. This immediately made me raise my hand and object to such ill-advised pigeonholing of HPC practitioners – architects, designers, software engineers, mathematicians, scientists, and engineers.

I am guilty of being an HPC bigot. I think these practitioners are some of the most pioneering and innovative folks in the global IT community. I indicated to the speaker (and the audience) that because of the pioneering, path-breaking pursuits of the HPC community, which is constantly pushing the envelope in IT, the IT community at large has benefited from such mainstream (today) mega IT innovations as Open Source, cluster/grid computing, and in fact even the Internet. Many of today's mainstream Internet technologies emanated from CERN and NCSA – both organizations that continue to push the envelope in HPC today. Even modern-day data centers, with their large clusters and farms of x86 and other industry-standard processors, owe their meteoric rise to the tireless efforts of HPC practitioners. As early adopters, these practitioners painstakingly devoted their collective energies to building, deploying, and using the early HPC cluster and parallel systems – servers, storage, networks, the software stack, and applications – constantly improving their reliability and ease of use. In fact, these systems power most of today's businesses and organizations globally, whether in the cloud or in some secret basement. Big data analytics, cloud computing, and even mobile/social computing (Facebook and Twitter have gigantic data centers) are trends that stand on the shoulders of the HPC community!

By IT standards, the HPC community is relatively small – about 15,000 or so practitioners attend the annual Supercomputing event. This year's event is in Seattle and starts on November 12. But HPC practitioners have very broad shoulders, keen and incisive minds, and a passionate demeanor not unlike that of pure mathematicians. Godfrey H. Hardy – a famous 20th-century British mathematician – wrote A Mathematician's Apology, defending the arcane and esoteric art and science of pure mathematics. But we as HPC practitioners need no such apology! We refuse to be castigated as irrelevant to IT and big IT trends. We are proud to practice our art, science, and engineering. And we have the grit, muscle, and determination to continue to ride in front of big IT trends!

I have rambled enough! I wanted to get this “off my chest” over these last few months. But with my dawn-to-dusk day job of thinking, analyzing, writing and creating content on big IT trends for my clients; and with my family and personal commitments, I have had little time till this afternoon. So I decided to blog before getting bogged down with yet another commitment. It’s therapeutic for me to blog about the importance and relevance of HPC for mainstream IT. I know I can write a tome on this subject. But lest my tome goes with me unwritten in a tomb, an unapologetic blog will do for now.

By the way, G. H. Hardy's Apology – an all-time favorite tome of mine – is not really an apology. It's a passionate story explaining what pure mathematicians do and why they do it. We need to write such a tome for HPC to educate the broader and vaster IT community. But for now this unapologetic blog will do. Enjoy. It's dusk in Connecticut. The pen must come off the paper. Or should I say the finger off the keyboard? Adios.


The US Healthcare System – One Big Tax on the Economy – Beyond Costs and Operational Efficiencies – Innovation is Critical – Technology Helps.


It's well known that US healthcare costs are skyrocketing. Estimates range from 15%-20% of US GDP – a greater share than in any other developed nation in the world. Left unchecked, this will be a big burden that today falls largely on US employers and businesses. And these businesses have to pass the costs on to their customers, making them cost-uncompetitive in an increasingly globalized world. I found the following recent articles very illuminating in describing the challenges in US healthcare and the implications of globalization:

  1. The Big Idea: How to Solve the Cost Crisis in Health Care, Robert S. Kaplan and Michael E. Porter, Harvard Business Review, September 2011.
  2. The Risks and Rewards of Health-Care Reform, Peter Orszag, Foreign Affairs, July/August 2011.
  3. How America Can Compete – Globalization and Unemployment, Michael Spence, Foreign Affairs, July/August 2011.

But the big question is what each of us can do individually, collectively in an organization, and in our ecosystem across organizations – nationally and globally.

On a recent weekend, October 1, I attended a talk by Dr. Atul Gawande sponsored by the New Yorker magazine and IBM. It was preceded by an exclusive breakfast meeting with Atul. I was fortunate to be invited, and I thank IBM for a very gracious invitation to this event hosted by Dr. Paul Grundy of IBM, who is also President of the Patient-Centered Primary Care Collaborative. At breakfast, I also got to spend some quality time with the publisher of the New Yorker and other doctors (all medical – not the Poor Hungry Doctor (Ph.D.) kind, like yours truly!) who are all facing these challenges of the US healthcare system.

During the breakfast event and the subsequent talk, much of the emphasis was on reducing costs and improving operational efficiencies in the US healthcare system. Dr. Gawande was very effective in conveying his path-breaking ideas on how checklists and coaching can greatly improve a surgeon's performance and result in far better patient outcomes.

Dr. Gawande started with the premise that we all reach a plateau at one point or the other in our lives and careers. And as we push ourselves to become better at what we do, the marginal benefits of our efforts seem to be all for naught. So what can we do? How can we increase our operational efficiency? His recipe marries continuous learning with coaching.

I encourage everyone interested in this subject to read his recent article in the New Yorker and also his book on checklists. The book also covers professions beyond surgery, including architects, athletes, and others. It stresses that, in addition to continuous learning throughout one's life, a coach is an essential partner for continuous self-improvement in any profession, particularly those that are knowledge-based. This clearly includes mine – Information Technology (IT) analyst and entrepreneur.

As IT professionals, our lives have become complex – that is today's harsh reality. We all have to do more with less, as we have less time and leaner budgets. And yet we also have to do more with more, as we are drowning in data, interruptions, and regulations. This more-or-less is driving us nuts. Everything is escalating at a frantic pace while margins continue to dwindle. We are constantly challenged to improve, every day, in what we do operationally.

Part of the problem is IT itself. IT in some ways has caused this problem, and I think IT is also part of the solution. I constantly ask myself these reflective questions: Is speed a virtue? Is Big Data really that useful? Is constant improvement always better? I think the answer to each is the proverbial "Yes and No," which drives me further nuts. Being an engineer, I like the determinism of a precise, unambiguous answer. I like the precision of checklists but clearly also appreciate the value of coaching! So it is Yes and No for me now on these philosophical issues.

While IT has made a very positive impact on improving the operational efficiencies of healthcare, process innovations are also required (some IT-enabled, others requiring business incentives). In fact, in response to a question from the audience, Atul gave an example of how a surgeon in his hospital was able to take a standard but lower-cost surgical gauze and cut it so that it was better fit for purpose – tuned to task – rather than using the more expensive pre-cut gauze. This adjusted process was then adopted by several surgeons in the hospital, resulting in substantial savings in operational costs while improving patient outcomes. This was clearly a business process innovation!

But IT must itself be tuned to task and fit for purpose. In short, IT must become smarter. It's what IBM calls Smarter Computing. With Watson and other related smart IBM efforts, and by fostering collaboration across the healthcare ecosystem (Dr. Grundy's efforts), IBM is providing the incentive and impetus needed to help address the challenges of the US healthcare system. With events such as the one on Oct/1, IBM and its partners are providing the mentoring and coaching for everyone touched by the healthcare system!
