The new IT Landscape


A view from the Infrastructure

In the last 15 years, the infrastructure landscape was defined by the demands of the business. That, of course, will not change. However, the approach where one business line demands middleware X and another middleware Y will stop. There is a profound reason for that.

In the last couple of years, the physical infrastructure has become dramatically commoditized. This has reached a point where the savings for large enterprises no longer come in significant dimensions. Efficiency through server virtualization, and nowadays storage virtualization, has reached more than 80% in some enterprises. With new storage and server orchestration layers and additional concepts like the enterprise hybrid cloud (EHC), this can be tweaked further, but it first requires a different approach to IT operations.

Key here is the private cloud, which is similar to the public cloud offerings, but of course on premise.

So what is the catch?

Mainly the operation. In the traditional datacenter, many enterprises and global operational IT departments have built a structure that maps the silo approach of the LoB (Line of Business). You will find functions focused on servers, storage, networking, databases, middleware, etc. Each of them has coordination functions with the LoB and cross-functional sections. Lots of conversations I have with those entities in the IT department end with the claim that they can do it better than external companies like VCE, which offers converged infrastructure. Many of them also hide behind the "vendor lock-in" argument.

On the other side, we see that this costs the companies a fortune. Often these IT departments spend around 70% of their budget here; put the other way, there is a lot they could save.

What has changed?

With the concept of "as-a-service", IT has the ability to automate many tasks and build a software layer as the final governance. With SLAs built into the software-defined components, IT personnel no longer have to plan, define, think about, and run everything themselves. Combined with converged infrastructure and the possibilities of software-defined everything, this changes the silo approach into a more holistic view of the datacenter. This does not only save cost and move test and development of the infrastructure back to the vendor; it also allows higher integration of resources to drive more efficiency.

How does the LoB react?

Often they are already there. With the offerings of the public cloud, the development of new software in these organizations often happens without the IT department's involvement. This is a major concern of the CIOs and CDOs, which I hear very often. LoBs look at the business outcome; they now have alternatives to the internal IT, and they move off.

So what is next?

From my view, much of the work ahead lies in analyzing the current state of the IT department and how mature it already is in the as-a-service transformation. There are various offerings, like the IT Transformation Workshop of EMC, to define and reshape the IT landscape. Have a look at that.

So what about the applications?

Not so simple. Three types of applications will be found in many enterprises.

Some applications only deliver information and exist for historical reasons. Others are large monolithic enterprise apps, like SAP or Oracle applications. The third kind are new apps for the new business lines, touching web, mobile, social, and cloud.

For the first kind, I would retire them and replace them with a database delivering the results. Maybe there are apps no longer used, but nobody realizes it? Shut them down. The second kind is trickier: each has to be looked at case by case to build a migration strategy, and this may take some months or years. The last kind I would put immediately on the new concept of infrastructure.

So what are the key characteristics of this infrastructure?

Automation and orchestration, commoditization and standardization. To drive more cost out of IT, the next generation of architecture has to follow these rules. More than that, it has to build an independent layer between the physical resources and the applications: an interface between the resources and the applications. Efficiency and time to provision can only be gained with automation. Modern architectures drive provisioning down from weeks to days or even hours, define the SLA, and report back the cost of the selected SLAs. They also report back whether a service breached its SLA or performed within the paid and agreed parameters.
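As a sketch of what such automated SLA reporting could look like in the software layer, here is a minimal, hypothetical compliance check; the class and field names (ProvisionRequest, sla_hours) are assumptions for illustration, not any vendor's API.

```python
# Minimal sketch of automated SLA reporting, as described above.
# All names (ProvisionRequest, sla_hours, ...) are hypothetical.
from dataclasses import dataclass

@dataclass
class ProvisionRequest:
    service: str
    sla_hours: float      # agreed time-to-provision
    actual_hours: float   # measured time-to-provision

def sla_report(requests):
    # Report, per service, whether the agreed SLA was met.
    return {r.service: r.actual_hours <= r.sla_hours for r in requests}

requests = [
    ProvisionRequest("db-cluster", sla_hours=24, actual_hours=6),
    ProvisionRequest("web-farm", sla_hours=4, actual_hours=9),
]
print(sla_report(requests))  # {'db-cluster': True, 'web-farm': False}
```

In a real software-defined stack the measured hours would come from the orchestration layer's audit log rather than being typed in by hand; the reporting logic itself stays this simple.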

Finally, this whole journey starts with the ability of the IT department to change and to understand the journey to the private cloud.

Image courtesy of pro-physic.de, EMC Corporation

Read More:

http://itblog.emc.com/category/it-transformation/

https://blogs.vmware.com/cloudops/it-transformation

The Human Body, the most advanced factory – the real big data


Since mankind became self-aware, we have sought to understand how the human body and its functions work and how the soul is integrated into the whole system. It has taken from the early days of Earth until now to optimize the factory and develop higher functions, which we like to call consciousness, the brain, and social empathy. The human ecosystem is in essence a factory. In numbers: every day, around 20 billion cells are replaced by new ones out of the 10,000 billion we have. This means that every 10 years your body is rebuilt. This costs energy; in addition, we also lose energy, around 50 to 360 watts, to keep the factory running. By the way, most fitness trackers take this into consideration. On a daily basis this adds up to around 2.9 kWh. In the case of heavy work or sports, this goes up, of course.
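The two figures above fit together; a quick arithmetic check (120 W is an assumed round value inside the quoted 50 to 360 W range):

```python
# What average power does the quoted ~2.9 kWh per day correspond to?
daily_kwh = 2.9
avg_power_w = daily_kwh * 1000 / 24
print(round(avg_power_w))  # ~121 W, well inside the 50-360 W range

# And the other direction: an assumed 120 W average over a full day.
print(round(120 * 24 / 1000, 2))  # 2.88 kWh, matching the ~2.9 kWh quoted
```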

Comparing the datacenter with our body, we find astonishing parallels: the nervous system and the network, the blood and the power, the heating and the cooling. This comes from the same physics in which both we and the DC operate. If we dig into it, we will see that nature has solved most of the demands with much more creativity. We also find dedicated systems which act autonomously to keep the fabric running. Think of the limbic system and our ability to react to external wounds without a big escalation; root-cause analysis is done on the fly for the minor issues.

EMC DC Durham

You can read more details on this fascinating comparison in the next episodes:

  • Episode I: The Human Body, an optimized, fully automated factory
  • Episode II: The Blood in the human factory
  • Episode III: The Sensory
  • Episode IV: The Big Data approach in the Human factory
  • Episode V: The Control in the human Factory
  • Episode VI: The CyberControl in the Human factory
  • Episode VII: The Chain of Command in the Human Factory
  • Episode VIII: Automation in the human factory
  • Episode IX: Energy consumption model in the human factory
  • Episode X: Influence of the Soul in the digital Factory
  • Episode XI: Final thoughts on the digital factory

What kind of Analytics?


In various discussions with my customers and colleagues, I have experienced a very controversial debate around analytics.

Some understand it as "what has happened", also called root cause; others want to predict the future, meaning "what will happen"; a third group tries to answer the question "what could happen". Funny enough, the answer is often a combination of these. When it comes to our very fast decision-making and consuming society, this can cause some friction.
In the first case, let's call it descriptive analytics, many companies invest money to avoid a repeat, but only after an unpredicted incident has happened.

Often you find it difficult to get to the root cause, since the data or information is simply no longer available. In the second case, statistical models show the business, based on historical information, what can go wrong; let's call it predictive analytics. In the last case, data scientists use machine learning to find the possible future based on different information streams; this is known as prescriptive analytics. Like in chess, where at the first move everything is possible and only some prediction can be made based on information about the players, the picture becomes clearer over time as more moves are taken.
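The three kinds of analytics described above can be contrasted on a toy series of monthly incident counts (the numbers are hypothetical, and the trend fit is a deliberately naive least-squares line, not a real forecasting method):

```python
# Descriptive vs. predictive vs. prescriptive, on assumed toy data.
from statistics import mean

incidents = [4, 5, 3, 6, 7, 8, 9, 11]  # hypothetical monthly counts

# Descriptive: what has happened?
print("average so far:", mean(incidents))

# Predictive: what will happen? A naive linear trend extrapolation.
n = len(incidents)
xs = range(n)
x_bar, y_bar = mean(xs), mean(incidents)
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, incidents)) / \
        sum((x - x_bar) ** 2 for x in xs)
forecast = y_bar + slope * (n - x_bar)
print("next month, extrapolated:", round(forecast, 1))

# Prescriptive leans on "what could happen": enumerate scenarios
# around the forecast and let the business pick an action per scenario.
for label, factor in [("optimistic", 0.8), ("expected", 1.0), ("pessimistic", 1.2)]:
    print(label, round(forecast * factor, 1))
```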

The human brain is capable of doing this kind of prescription on the fly; we call that experience. By the way, this is a reason why young managers or entrepreneurs often fail, but that is worth another blog. The key here is that the brain of an expert has built up enough information and channels that it can do both prediction and prescription.
Having the scientists build the model, meaning the mathematical representation of the information combination, can lead to a variety of possible outcomes; we call this data science or machine learning. Key is the amount of information: it has to be deep and reach far back historically. This is also one of the boundaries.

Many CIOs and CTOs of the companies I talk to today do not keep the data, for many reasons, mostly cost. So what can a data scientist then do? Simple: build the model and then run it. Over time, as the information is preserved, it will get better. Like in chess, where you can see extremely professional players analyze and predict moves to a depth the untrained could not imagine, or even provoke moves. This is not yet embedded in this science.

A couple of weeks ago I was fortunate to see a start-up and talk to the founders. It was all about customer intimacy and combining information within a company to build a better customer relationship. I would drive this one dimension deeper: why not use this as an internal knowledge base? Let's predict what the corporate user needs to find around a specific subject; not only internal sources but also external ones can be used here.

I am still wondering why HR representatives and headhunters do not use predictive analytics to identify the right candidate for a job.
Yes, you're right: the information in a company is often filed away so scattered that we cannot combine it. Here we go with the concept of a data lake: building one repository of information and using it for internal and external benefit.

prop-jet engine

However, the last topic is the "what could happen" case, prescriptive analytics. This is all about possibilities, and often risks. I guess this is used in Six Sigma, and mankind is used to doing it too: the mother warns the child about a possible accident when walking on the street, and the company's strategic planning is all about such scenarios. I think this method means "I do not have enough data to be more precise."

So what to do? Keep all the data, maybe randomized and anonymized, but keep it, because you never know what business can be built out of the treasure the company has generated over the years.
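A minimal sketch of the "anonymize but keep it" idea: replace direct identifiers with salted hashes, so records stay joinable for future analytics without exposing who they belong to. Field names and the salt handling are illustrative assumptions, not a complete privacy scheme.

```python
# Pseudonymize direct identifiers while keeping the analytical value.
import hashlib

SALT = b"rotate-and-store-me-separately"  # assumption: salt kept outside the data lake

def pseudonymize(value: str) -> str:
    # Same input + same salt -> same token, so records remain joinable.
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:16]

record = {"customer": "Jane Doe", "order_total": 249.90, "country": "DE"}
safe = {**record, "customer": pseudonymize(record["customer"])}
print(safe)  # customer name replaced by a stable 16-hex-char token
```

Note that salted hashing of names is pseudonymization, not full anonymization; for regulatory purposes, fields like country may still need generalization.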

Welcome to the Age of Information!

Disrupt the Disruptor


Platform (Photo credit: Geir Halvorsen)

In a recent interview, the CEO of Pivotal, Paul Maritz, stated that he wants to disrupt the disrupter. The AWS platform offering addresses a majority of the public cloud demands today, and Pivotal wants to play strong here too.

So what is the disruption AWS is addressing?
"Fast": a datacenter by credit card, to name one. "Easy" maybe another. "Cheap" is also stated. All of this has nothing to do with a conventional DC; rather the opposite: slow, complex, and expensive is what CIOs today see in their environments. This is the disruption. There is only one missing piece: "trust", or "risk". Is that not what DCs are built for?
Let's circle back in time to when the mainframe was born; let's call that platform one. The applications written there largely became the production systems. Of course there were few users and few demands, but the origin came through the developers.
History has shown that only one player was left after 30 years. The programs are still running and working, but they are hard to maintain and cannot stay in line with current standards of customer experience.
With the age of the PC, and more so with the appearance of Linux and the x86 architecture, the second platform was born; we called that client-server architecture. Here too the developers led the way with creativity and innovation. Many languages were born and ended up in the internet itself. The hardware was still very expensive and followed Moore's law for years. Microsoft became the dominant player. This generated the complexity in the DC, since the legacy had to work with the new stuff. Silos were built, and middleware controlled the systems.
"Suddenly", Google- and Facebook-like companies were born, and storage, CPU, and network were more or less given away for free. They connected billions of users with millions of applications. The app century was born; let's call it the third platform. And again the developers paved the way. From my perspective, most of the current apps are specialized browsers with a nice interface to information.
Here we go: that is the buzzword we will see more and more in the future. Information, which is data with meaning, was always generated in companies and kept very close. Now open data has appeared, mobile devices generate more data, and information can suddenly be enriched to put new business models in place.
Where in platforms one and two business leaders had to invest in datacenters without knowing how large the business would grow, they can now purchase on the trip and on demand via a credit card. Nice.
Many apps and services consumers buy today leverage such infrastructure.
Hence, guess what: most of the startups leverage this model.

Pivotal Labs (Photo credit: teamstickergiant)

Now how to disrupt this growing business?
The answer lies in the definition of which business. Large global companies will have big trouble running their business on the current offerings. Even if the current cloud providers adapt fast and are very creative, there will be a lot of legacy that cannot be transported, just as the mainframes could not be transported to the client-server architectures. The key is in combination, or bridging. Investments which have been made, and will be made, to support the current business models have to be adjusted for the next-generation architecture. EMC's federation approach is aiming for that. Complexity of infrastructure will be solved by converged and software-defined-X concepts; in optimization and orchestration of the infrastructure, VMware leads the industry by a magnitude; and Pivotal will provide the open platform architecture to combine the business needs of current and future demands.
When we talk technology, Cloud Foundry and major parts of Pivotal One, which comprise many supreme technologies, are open and crowd-developed to capture the great ideas of this planet at exponential scale. This is the real disruption. Platform one was very country-centric, and platform two was dominated by the thinking of the western world; the next platform has to address a global demand and population.

Enhanced by Zemanta

Why Standards Matter


Do we really need standards in infrastructure?

Mandelbrot by Frax for iPad

For a couple of years now I have been having discussions with CIOs and other technology employees in global companies around standards. In fact, the introduction of SAN as a protocol standard enabled the largest consolidation and optimization in the DC, starting in early 2000 and continuing until now. One of the reasons is that there has to be a maturity of the market, and also a common sense that there is no benefit in defining a "company" standard.
As we have seen in the telco industry, where over the years companies defined their own phone infrastructure, everything has now changed to a VoIP-driven model with a lot of benefits. We see the same nowadays in the DC with the appearance of converged infrastructure. However, this approach is not yet mature enough to have one single standard, and the manufacturers of infrastructure and the VARs define their own ways. This currently has major impacts at customer sites, since the next-generation converged model will either be defined through organizations like SNIA and OpenStack models, or, more likely, it will be a third-platform approach, which may be called cloud.
Whatever the next years bring, the key here is that decisions are driven not only by price but more by the TCO and the future of the proposed architecture.
It is obvious that innovation cycles will accelerate and that the demand for more flexibility on the business side will drive different infrastructure needs.

What happened on the manufacturer side?

Keep it simple: that is the demand stated by the creators of the third platform. However, the enterprise IT of global customers still runs on the first or, mostly, the second platform. This has to be taken into consideration. Associations like SNIA take this as a basis and define standards which bridge between the new, innovative requirements and the current business-demand IT requirements. Industry standards take a while to establish, and often the manufacturers develop their own "standard" to keep customers closer. In the early stages of a new technology this may be very convenient, but as the approach matures, the move to industry adoption becomes necessary.
The same is currently happening in SNIA. The standards around the traditional SAN are defined, and management capabilities like SMI-S are defined and adopted. New areas like big data, object storage, analytics, and flash are more in the definition phase, and the manufacturers define their own strategies and APIs. SNIA is working deeply with the vendors to get the industry better shaped here. With a lot of startups still appearing and innovation happening there too, industry standard definition is still at its starting point; but customers would be wise to ask for certified products, to make TCO and technology adoption more efficient and to serve not only the cost model but also the new demands of their business.

Mandelbrot by Frax


The MS-DOS Phase of BigData


The first developers of IBM PC computers neglected audio capabilities (first IBM model, 1981). (Photo credit: Wikipedia)

Most recently I was invited to talk to various CIOs in the German market. The focus of my presentation was the next years: what to expect from society, how IT changes everything, how to prepare, and what to educate employees on. IDC talked about the concept of the third platform. Thinking about the BigData hype while preparing the day, I was reminded of my times with MS-DOS. There was not only one approach in the "PC era": there was DR-DOS, Apple, still mainframes, etc.

One of the reasons MS-DOS got more attention was its affinity with IBM and the combination of a device with a marketplace at that time. Also, IBM left the space open for others to develop software, add-ons for the PC, etc. This generated momentum.

Looking back, the development towards Windows 7 took a close combination of physics, like Intel, software, like Microsoft, and a change in the market.

We are currently experiencing the same with cloud, virtualization, and the future stage of a software-defined datacenter; I doubt we have passed the MS-DOS phase yet.

So what is the MS-DOS phase?

Plenty of players, no defined market, overselling of functionality, misleading abbreviations, and very poorly educated experts, the best of them coming from the vendors themselves.

In all my talks with the CEOs and CTOs of various industries, the same questions come up:

  • What can I do with this technology?
  • Who can help me build out the business context?
  • Is it already time to invest?

There is a clear answer: it depends!

It depends on the maturity of the IT department:

  • Are they still just maintaining the IT, or do they drive innovative, business-related IT processes?
  • Where is the infrastructure? Is the IT department still deploying storage, network, and servers, or are they running a converged infrastructure with a services layer?
  • Do IT departments still focus their education on infrastructure, or are they hiring data scientists?

MD Tag: MS DOS (Photo credit: shawnblog)

If you can answer this for your IT department, you are much closer to leaving the MS-DOS phase behind you.

In the past, many business models were established and ran for years. This will no longer be true in many traditional businesses. This business, too, is often in an MS-DOS phase, but it will adapt through the new market drivers: consumerization, urbanization, mobility, machine-to-machine decision processes, and new adaptive computation processes which adjust the business to the demands of the users faster than real time.

New concepts are unpredictable, like crowdsourcing, open source, Raspberry Pi physics, and machine learning. There is only one big truth: the next shift will not take 20 years like the one from MS-DOS to OS X; it will happen tomorrow.

Enhanced by Zemanta

My top 10 anticipations for 2013


The cloud will emerge in all parts of the IT

More services emerged in 2012; their adoption will drive more cloud, and more cloud offerings will appear. This circle will speed up and drive more enterprises to evaluate their next private cloud.


The engine will be built at the manufacturer, not on the customer's premises

In many enterprises, 2012 started an evaluation of converged infrastructure. Since most of the components used are standardized, the "build on customer site" or "do it myself" approach will be questioned more and more by the CFO. As in the past, when whole servers were built by local shops or by the enterprises themselves, it is obvious that the purchasing department will take a different look when it comes to TCO. This will also start to include backup and security. Orders will move to workloads and software demands rather than cores, PB of storage, or network interconnects. The VCE model will become the standard delivery method for modern DC architectures.

Consumerization of IT will drive more business adoption in the cloud

The DC will have more direct contact with the end customers. This was one of the main drivers in 2012, built through Google/Android and Apple/iCloud, and it will move faster. The new paradigm enables enterprises to optimize cost and to optimize the business model when talking directly to end customers. In 2013 we will see this shift also into computer-to-computer relationships. The interaction of the B2B mesh will lead to faster purchasing and information flow to optimize the business, and the early adopters will be the winners. Key here is an agile IT infrastructure.

Cloud will feed BigData, BigData will enable more cloud

Each one depends on the other. With more cloud, the information streams are easier to combine for BigData analysis; with more BigData analysis and applications, more agile computing environments (called cloud) become necessary. This will be a big trend, though depending on the industry. New businesses will learn to use the information streams, and new forms of analytics will be generated to enhance processes and decision-making.

Flash will be leading the storage industry

In recent years, flash was used as a replacement for spinning disk. This has a huge impact on performance and on the utilization of computing power. Man-years of brainpower were spent developing smart algorithms to leverage the new tier in the storage. This will continue to evolve, and with the drop in price and the increase in capacity, the market will grow dramatically. However, flash does not only have to be squeezed into the spinning-disk mold; it also allows new algorithms to be designed and implemented to utilize the power of flash. When IOs are no longer the limiting factor, CPU power can be used to let the storage system take on many more tasks while still delivering high performance to the servers.

The CIO will refocus on the core: the Information

With all the changes in IT, the transformation of roles in IT management will not stop. Since the DC gravitates toward the information, it is obvious that the CIO will become the master of the information: not only the information processed in the own DC, but also the information needed from outside, as in BigData applications, and the information given to the outside, i.e., to customers. New technologies like the Horizon application manager from VMware will support this transition, and the CIO and team will be transformed into information brokers, security agents, and consultants for the business lines.

Standardization will enable more productivity

One aspect of going from private cloud to hybrid is standardization. Many companies, like Microsoft, Google, or Amazon, among others, define their APIs and push them as de-facto standards. Experience has shown that the early adopters often drive this but run against a wall within a couple of years. Ethernet/IP and FC would not have been so broadly accepted if a standards body had not been formed. We currently see various associations taking on this role in the cloud, like the SNIA organization. This is the only way to help the DC out of the "do it by myself" mode and toward a focus on more business-relevant tasks. The engine (converged infrastructure) will be developed and assembled on the vendor's premises, and the enterprise DC managers can focus on utilization.

The storm on clouds will drive orchestration

When virtualization was first introduced to customers, the VM landscape could still be controlled by humans. It is similar to the SAN in the early 2000s, when the storage array was still close to a server. This will continue to change as we see tens of thousands of VMs in modern DC architectures. Orchestration will lead to the agility necessary to drive more business flexibility. Vendors, led by VMware, will provide more sophisticated solutions to automate here.
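The orchestration idea can be illustrated with a toy reconciliation loop: instead of administrators creating each VM by hand, an orchestrator compares a declared desired state with the actual inventory and converges them. This is a hypothetical sketch of the pattern, not any vendor's product.

```python
# Toy desired-state reconciliation, the core pattern behind VM orchestration.
desired = {"web": 3, "db": 2}            # declared: VMs each tier needs
actual = {"web": 1, "db": 2, "old": 1}   # current inventory

def reconcile(desired, actual):
    actions = []
    # Create VMs for tiers that are under their declared count.
    for tier, count in desired.items():
        for _ in range(count - actual.get(tier, 0)):
            actions.append(("create", tier))
    # Destroy VMs that exceed (or are absent from) the desired state.
    for tier, count in actual.items():
        surplus = count - desired.get(tier, 0)
        actions.extend([("destroy", tier)] * surplus)
    return actions

print(reconcile(desired, actual))
# [('create', 'web'), ('create', 'web'), ('destroy', 'old')]
```

Run in a loop against live inventory, this is what lets tens of thousands of VMs be managed without a human touching each one.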

Keys. (Photo credit: Bohman)

Security and Trust will be in the middle of everything

Since information is the key, as it was in the past, but now in a much more open world, securing it is one of the key elements. The business will ask for answers, and companies like RSA will lead the way. Not only securing but also trusting other organizations is essential. With new regulations and demands of the business, information has to become more trustworthy.

InMemory systems will draw investments from infrastructure

Since in-memory systems put more demand on main memory than on high IO rates, they will re-architect the infrastructure in this area. 2013 will show whether these new technologies add new fields of application or replace existing ones. Technologies like GemFire and HANA, among others, will drive faster decision-making and new infrastructure architectures. Combined with flash, companies like SAP and EMC will drive the industry here.
