Transforming the Enterprise


In recent years, many industries have shifted from a traditional business approach, developed over decades, to a software-defined version of it. There are compelling reasons why this has happened.

(c) by presentation load


When the industry develops products, it often takes a decade or more before they come to market. Take cars: everything in the cars currently in mass production started design and development in the last 5-10 years. Updates often only land in the next generation of the product cycle.

This is obvious for mechanics, but software can be adopted much faster. A good example is Tesla Motors, which changed the industry with the concept of building a computer in the form of a car. Software is updated nightly over the air, and new functionality becomes available to the driver or passengers. And not only that has changed; selling this kind of car is different, too. While traditional car dealers face the exercise of training all their sales personnel on new functions and features, new leasing models or service capabilities so they can explain them to customers, modern companies move the sales structure to the internet with an easy-to-update and adjustable model. As a result, options and selling capabilities depend more on the flexibility and creativity of the company than on the salesforce and its adaptability. The new model that traditional enterprises stumble into deeply demands the adoption of agile and innovative behavior and processes to leverage the demand and open new segments for doing business.

Why is this happening?

Because it is possible. With the appearance of the cloud and the models it supports, startups have shown that it is easy to build a business without a large investment in infrastructure or a datacenter. Even more: in the past you had to ask investors for a large amount of money to build the DC; now you can pay as you build your business. This enables investing the capital in the business model rather than in the IT landscape. But that is only one aspect. With the commoditization of IT resources and container-based IT, it is much more cost-efficient and reliable to build enterprise-class IT with a minimum of investment. However, there is a trap many companies will fall into: standardization. Currently there is a belief that one cloud standard, driven by cloud providers, can be the right one, but history has shown that this leads to more cost and will be replaced in time by an industry association. We see this on the horizon already with OpenStack, which is still far from enterprise-ready. The key will also be more in the PaaS layer with open software, like Cloud Foundry and Docker, which opens a broader ecosystem for applications and operations.
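To make the pay-as-you-build argument concrete, here is a minimal sketch comparing an upfront datacenter investment with month-by-month cloud spend. All figures (build cost, unit price, growth rate) are invented for illustration, not taken from any real offering.

```python
# Illustrative comparison: upfront DC investment vs. pay-as-you-go cloud spend.
# All numbers are hypothetical assumptions.

def upfront_cost(build_cost: float, monthly_ops: float, months: int) -> float:
    """Total cost when the datacenter is built and paid for on day one."""
    return build_cost + monthly_ops * months

def pay_as_you_go_cost(monthly_usage: list, unit_price: float) -> float:
    """Total cost when capacity is rented month by month as the business grows."""
    return sum(u * unit_price for u in monthly_usage)

# A startup whose capacity need grows 25% month over month from 10 units.
usage = [10 * 1.25 ** m for m in range(12)]

dc = upfront_cost(build_cost=500_000, monthly_ops=10_000, months=12)
cloud = pay_as_you_go_cost(usage, unit_price=300)

print(f"upfront DC:    {dc:,.0f}")
print(f"pay-as-you-go: {cloud:,.0f}")
```

The point is not the exact numbers but the shape: the capital stays in the business until the usage actually materializes.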

Innovation Hand illustration by Dinis Guarda


So how do we enable the “New” Enterprise model?

The new model will be driven by innovation in software and applications. In my daily talks with large companies and customers, many of them think about how to implement these two aspects in their business process modelling. Often it is driven out of the IT department, but the link to the business and its drivers is missing or simply not established. I see large enterprises and global companies investing in application development through the lines of business, building a second pool of IT knowledge that is closer to the business than to agile development. This not only often leads to a wrong assessment of the best development environment, it also creates a new class of information islands. In the long run this will not be the right innovative approach for many enterprises, although it lets them adapt and compete much better with the new kids on the block, the startups. My advice to CIOs and cloud architects is always to engage actively with the LoB departments and help them change to a more agile and innovative model, what we call continuous innovation, and in return get the IT expertise to make the right strategic decisions for the company.

IT providers, like EMC and the Federation, enable this process and guide customers through it. With various iterations EMC can analyze the current status of an IT department and show the path from a 2nd-platform concept to the modern web-scale architecture the 3rd-platform concept demands. Since this is not a “shoot once and forget” exercise, in IT terms the “New” model is constant change. Where the past was about managing resources and striving for more synergy and “innovation” through new hardware and software, in the next decade IT departments will be more a broker of public and private clouds, perhaps also for other companies as an additional service.

How to proceed?

It is not simple and has to happen step by step, since the current change of the business model in many verticals is not only driven by development and operations aspects; it is also deeply influenced by big data concepts, which often lead to an Internet of Things discussion. Silos and the public cloud may be an answer, but in many cases I see the key to success in a joint effort of the business units and the people responsible for IT in the enterprise.

The new IT Landscape


A view from the Infrastructure

In the last 15 years the infrastructure landscape was defined by the demands of the business. That, of course, will not change. However, the approach where one business line demands middleware X and another middleware Y will stop. There is a profound reason for that.

In the last couple of years the physical layer running the infrastructure has dramatically commoditized. This has reached a point where the savings for large enterprises no longer come in significant dimensions. The efficiency through server virtualization, and nowadays storage virtualization, has reached more than 80% in some enterprises. With new storage and server orchestration layers and additional concepts like the enterprise hybrid cloud (EHC) this can be tweaked further, but it first needs a different approach to IT operations.

Key here is the private cloud, which is similar to the public cloud offerings, but of course on premise.

So what is the catch?

Mainly the operation. In the traditional datacenter, many enterprises and global operational IT departments have built a structure that maps the silo approach of the LoB (Line of Business). You will find functions focused on servers, storage, networking, databases, middleware etc. Each of them has coordination functions with the LoB and cross-functional sections. In lots of talks I have with those entities in the IT department, they always claim that they can do it better than external companies like VCE, which offers converged infrastructure. Many of them also hide behind the “vendor lock-in” argument.

On the other side, we see that this costs the companies a fortune. Often these IT departments spend 70% of their cost on it; put the other way around, they could save a lot of it.

What has changed?

With the concept of “as-a-service”, IT has the ability to automate many tasks and build a software layer as the final governance. With new concepts of SLAs built into the software-defined components, IT personnel no longer have to plan, define, think about and run it all. Combined with converged infrastructure and the possibilities of software-defined everything, it changes the silo approach into a more holistic view of the datacenter. This not only saves cost and moves test and development of the infrastructure back to the vendor, it also allows higher integration of resources to drive more efficiency.
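A minimal sketch of what “SLAs built into the software-defined components” can look like: the service catalog carries the SLA, and provisioning is a function call instead of a silo-by-silo planning exercise. Tier names, performance numbers and prices are all invented for illustration.

```python
# Sketch: as-a-service provisioning where the SLA is part of the service
# definition. All tiers and figures are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class ServiceTier:
    name: str
    iops: int            # guaranteed storage performance
    availability: float  # e.g. 0.999
    cost_per_gb: float

CATALOG = {
    "gold":   ServiceTier("gold",   20000, 0.9999, 0.90),
    "silver": ServiceTier("silver",  5000, 0.999,  0.40),
    "bronze": ServiceTier("bronze",  1000, 0.99,   0.15),
}

def provision(tier: str, size_gb: int) -> dict:
    """Return a provisioned 'volume' whose SLA is baked into the request."""
    t = CATALOG[tier]
    return {"tier": t.name, "size_gb": size_gb,
            "sla": {"iops": t.iops, "availability": t.availability},
            "monthly_cost": round(size_gb * t.cost_per_gb, 2)}

print(provision("silver", 500))
```

Nobody sizes storage by hand here: selecting a tier selects the SLA and the cost at the same time, which is exactly the governance-in-software idea.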

How does the LoB react?

Often they are already there. With public cloud offerings, the development of new software often happens in these organizations without the IT department being involved. This is a major concern of the CIOs and CDOs, which I hear very often. LoBs look at the business outcome; they now have alternatives to internal IT, and they move off.

So what is next?

From my view, a lot will come down to analyzing the current state of the IT department and how mature it already is in the as-a-service transformation. There are various offerings, like the IT Transformation Workshop of EMC, to define and reshape the IT landscape. Have a look at that.

So what with the applications?

Not so simple. There are three types of applications found in many enterprises.

Applications which only deliver information and exist for historical reasons. Others are monolithic large enterprise apps, like SAP or Oracle applications. The third are new apps for the new business lines touching web, mobile, social and cloud.

The first kind I would retire and replace with a database delivering the results. Maybe there are apps no longer used, but nobody realizes it? Shut them down. The second kind is more tricky and has to be looked at case by case to build a migration strategy; this may take months or years. The last kind I would put immediately on the new infrastructure concept.
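The triage above can be sketched as a simple classification over an application portfolio. The three categories follow the text; the record fields and example app names are illustrative assumptions, not a real assessment method.

```python
# Sketch of the three-bucket application triage described above.
# App kinds and portfolio entries are invented examples.
from dataclasses import dataclass

@dataclass
class App:
    name: str
    kind: str  # "reporting-only", "monolith", or "new-3rd-platform"

def strategy(app: App) -> str:
    if app.kind == "reporting-only":
        return "retire: replace with a database/report service, or shut down"
    if app.kind == "monolith":
        return "assess case by case: build a months-to-years migration plan"
    if app.kind == "new-3rd-platform":
        return "move immediately onto the new infrastructure concept"
    raise ValueError(f"unknown kind: {app.kind}")

portfolio = [App("legacy-report", "reporting-only"),
             App("erp-core", "monolith"),
             App("mobile-portal", "new-3rd-platform")]

for a in portfolio:
    print(f"{a.name}: {strategy(a)}")
```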

So what are the key characteristics of this infrastructure?

Automation and orchestration, commoditization and standardization. To drive more cost out of IT, the next generation of architecture has to follow these rules. More than that, it has to build an independent layer between the physical resources and the applications: an interface between the resources and the applications. Efficiency and time to provision can only be gained with automation. Modern architectures drive provisioning down from weeks to days or even hours, define the SLA and report back the cost of the selected SLAs. They also report back whether a service breached its SLA or performed within the paid and agreed parameters.
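The report-back mentioned above can be sketched in a few lines: given the agreed availability per service and what was actually measured, flag each service as within parameters or breached. The service names and numbers are sample data, not from any real monitoring system.

```python
# Sketch: SLA compliance report. Sample services and figures are invented.

def sla_report(agreed: dict, measured: dict) -> dict:
    """Map each service to 'ok' or 'breached' by comparing measured
    availability against the agreed SLA target."""
    return {svc: ("ok" if measured.get(svc, 0.0) >= target else "breached")
            for svc, target in agreed.items()}

agreed   = {"web-shop": 0.999, "reporting": 0.99, "batch-etl": 0.95}
measured = {"web-shop": 0.9995, "reporting": 0.981, "batch-etl": 0.97}

for svc, status in sla_report(agreed, measured).items():
    print(f"{svc:10s} -> {status}")
```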

Finally, this whole journey starts with the ability of the IT department to change and to understand the journey to the private cloud.

Image courtesy of pro-physic.de, EMC Corporation

Read More:

http://itblog.emc.com/category/it-transformation/

https://blogs.vmware.com/cloudops/it-transformation

Disrupt the Disruptor


Platform (Photo credit: Geir Halvorsen)

In one of his recent interviews, the CEO of Pivotal, Paul Maritz, stated that he wants to disrupt the disruptor. The AWS platform offering addresses the majority of public cloud demands today, and Pivotal wants to play strong here too.

So what is the disruption AWS is addressing?
“Fast”: a datacenter by credit card, to name one. “Easy” may be another. “Cheap” is also stated. All this has nothing to do with a conventional DC; rather the opposite: slow, complex and expensive is what CIOs today see in their environments. That is the disruption. There is only one missing piece: “trust”, or “risk”. Is that not what DCs are built for?
Let's circle back in time to when the mainframe was born; let's call that platform one. The majority of applications written there determined the production systems. Of course there were few users and few demands, but the origin came through the developers.
History has shown that only one player was left after 30 years. The programs are still running and working, but they are hard to maintain and unable to stay in line with current standards of customer experience.
With the age of the PC, and more so with the appearance of Linux and the x86 architecture, the second platform was born; we called it the client-server architecture. Here, too, the developers led the way with creativity and innovation. Many languages were born and ended up in the internet itself. The hardware was still very expensive and followed Moore's law for years. Microsoft became the dominant player. This generated the complexity in the DC, since the legacy had to work with the new stuff. Silos were built, and middleware controlled the systems.
“Suddenly” Google- and Facebook-like companies were born, and storage, CPU and network were more or less given away for free. They connected billions of users with millions of applications. The app century was born; let's call it the third platform. And again the developers paved the way. From my perspective, most of the current apps are specialized browsers with a nice interface to information.
Here we go, that's the buzzword we will see more and more in the future. Information, which is data with meaning, was always generated in companies and kept very close. Now open data has appeared, mobile devices generate more data, and information can suddenly be enriched to put new business models in place.
Where in platforms one and two business leaders had to invest in datacenters without knowing how large the business would grow, now they can purchase on the go and on demand via a credit card. Nice.
Many apps and services consumers buy today leverage such an infrastructure.
Hence, guess what: most of the startups leverage this model.


Pivotal Labs (Photo credit: teamstickergiant)

Now, how to disrupt this growing business?
The answer lies in the definition of which business. Large global companies will have big trouble running their business on the current offerings. Even if the current cloud providers adapt fast and are very creative, there will be a lot of legacy which cannot be transported, just as the mainframes could not be transported to the client-server architectures. The key is in combination, or bridging. Investments which have been made, and will be made, to support the current business models have to be adjusted for the next-generation architecture. EMC's Federation approach is aiming for that: the complexity of infrastructure will be solved by converged and software-defined-X concepts; in optimization and orchestration of the infrastructure VMware is leading the industry by a magnitude; and Pivotal will provide the open platform architecture to combine the business needs of current and future demands.
When we talk technology, Cloud Foundry and major parts of Pivotal One, which comprise many supreme technologies, are open and crowd-developed to exponentially capture the great ideas of this planet. This is the real disruption. Platform one was very country-centric, and platform two was dominated by the thinking of the western world; the next platform has to address a global demand and population.


The MS-Dos Phase Of BigData


The first developers of IBM PC computers neglected audio capabilities (first IBM model, 1981). (Photo credit: Wikipedia)

Most recently I was invited to talk to various CIOs in the German context. The focus of my presentation was the next years: what to expect from society, how IT changes everything, how to prepare, and what to educate employees on. IDC talked about the concept of the 3rd platform. Thinking about the big data hype while preparing the day, I was reminded of my times with MS-Dos. There was not only one approach to the “PC era”. There was DR-Dos, Apple, still mainframes, etc.

One of the reasons MS-Dos got more attention was its affinity with IBM and, at the time, the combination of a device with a marketplace. IBM also left the space open for others to develop software, add-ons for the PC, etc. This generated momentum.

Looking back, the development up to Windows 7 has taken a close combination of physics, like Intel, software, like Microsoft, and a change in the market.

We are currently experiencing the same with cloud, virtualization and the future stage of a software-defined datacenter, and I doubt we have passed the MS-Dos phase yet.

So what is the MS-Dos phase?

Plenty of players, no defined market, overselling of functionality, misleading abbreviations and very poorly educated experts, the best of them coming from the vendors themselves.

In all my talks with the CEOs and CTOs of various industries, the same questions come up:

  • What can I do with this technology?
  • Who can help me build out the business context?
  • Is it already time to invest?

There is a clear answer: it depends!

It depends on the maturity of the IT department:

  • Are they still just maintaining the IT, or do they drive innovative, business-related IT processes?
  • Where is the infrastructure? Is the IT department still deploying storage, network and servers, or are they running a converged infrastructure with a services layer?
  • Do IT departments still focus education on infrastructure, or are they hiring data scientists?

MD Tag: MS DOS (Photo credit: shawnblog)

If you can answer this for your IT department, you are much closer to leaving the MS-Dos phase behind you.

In the past, many business models were established and then ran for years. This will no longer be true in many traditional businesses. This business, too, is often in an MS-Dos phase, but it will adapt through the new market drivers: consumerization, urbanization, mobility, machine-to-machine decision processes and new adaptive computation processes which adjust the business to the demands of the users faster than realtime.

New concepts are unpredictable, like crowdsourcing, open source, Raspberry Pi physics, and machine learning. There is only one big truth: from MS-Dos to OS X it will not take 20 years, it will happen tomorrow.


Learn from the past, think about the future, but live in the now!


When I prepared for my last exams at university, one of my professors mentioned that they did not intend to guide us to find good jobs immediately after university; the key was that we as graduates would have longer-term chances to find and keep excellent jobs. At the time I thought that was stupid: my profession starts with my first job, not in 10 years.

Die of an Intel 80486DX2 microprocessor (actual size: 12×6.75 mm) in its packaging. (Photo credit: Wikipedia)

Time showed me that it is key to have both in mind when you start, and that it is essential to stay laser-focused.

It is the same in datacenters, where CIOs are confronted with the new reality of BYOD, the data deluge, cyber attacks and a complexity explosion. Positions and processes that worked for years no longer seem accurate, personnel get confused, and the business units often claim to have more IT knowledge, since they use flexible services on the internet. The CIOs and IT managers I talked to often seek external help and call consultants, who tell them that everything has to change, processes have to be reworked and the headcount has to decrease.

What a challenge!

Most people underestimate the finest attribute mankind has: the ability to change and adapt. Of course this is not easy; people like to be in their comfort zone and don't like to be moved, to learn new skills, to move on and modify, even though they do this all their lives.

The most successful IT organizations I know have arranged themselves with this to a superlative degree, taken the employees along on the road and established a culture of change. As we learn from the past, the change never stops: we moved from monolithic mainframes to minis to servers to PCs and now to tablets; from proprietary OSes to open source; from complex-instruction CPUs to RISC and back; from multiple CPU architectures to multiple servers to multi-core architectures; from ASCII terminals to mouse-driven interactive PCs to gesture-controlled tablets; from 640 KB of RAM to terabytes of RAM; from 5 MB disk drives to 4 TB and now flash technology. So we know the past, and we understand that telecommunication, an enterprise asset 10 years ago, is now an outsourced, VoIP-driven business.

So what will the future bring to plan for? New devices, like the Google glasses; faster storage; a move from keyboards to voice-driven interaction with computers; in-memory databases built into a cloud OS; from applications to functions; from physical infrastructure topologies and silos to software-defined datacenters; from an application-centric approach to information-centric concepts; from blade farms to virtual workloads executed independently of the blades?

We will see computers talk to computers and make decisions for us; devices will generate information which is only relevant for seconds. Batch processing will become too slow to keep pace;

This image was selected as a picture of the week on the Farsi Wikipedia for the 13th week, 2011. (Photo credit: Wikipedia)

it will be replaced by big data algorithms.
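A tiny sketch of why streaming beats batch for information that is only relevant for seconds: an incremental (online) aggregate updates with every event, while a batch job would have to wait for the whole dataset. The running mean here stands in for any such streaming algorithm; the sensor readings are invented.

```python
# Sketch: an online running mean, updated per event instead of per batch.

def streaming_mean():
    """Generator-style running mean: send values in, read the mean back."""
    count, total = 0, 0.0
    mean = None
    while True:
        value = yield mean
        count += 1
        total += value
        mean = total / count

s = streaming_mean()
next(s)  # prime the generator
for reading in [21.0, 23.0, 19.0, 25.0]:
    current = s.send(reading)
    print(f"after {reading}: running mean = {current}")
```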

On the consumption model, we will purchase only what we consume; vendors will have to deal with end customers; enterprises will move legacy workloads to specialized and focused workbenches; and all information will have to be secured and trusted to be transferred.

So the key for the future will be all about information: generating, capturing, storing, manipulating, destroying and analyzing it. We call it: the gravity is on information.

Based on that, CIOs and IT managers have to act, and do act, in the now to prepare. Build an agile infrastructure; make the investments in the datacenter in skills, new technology, information and big data that support workload management; and secure the information which is critical to the enterprise or to the privacy of an employee or customer. Investments in the IT of an enterprise should help to build and prepare an agile datacenter around the information, ready for the near future and beyond.


The DataCenter Evolution


In the last decade the DC has undergone major reworks and design changes. In recent years the evolution was marked by the revolution in the market. Disruptive technology shifts and new classes of devices have changed the demands on a classic DC. Still, most of the money is spent on keeping the lights on. On average, only 25% of a datacenter's budget goes to new business and improving the situation.

The DataCenter

(c) by presentermedia

So what will happen?

First of all there is not one answer to this question, but we can try to look into the future challenges of IT businesses:

The world will continue to generate huge amounts of information. The consumerization of IT will continue at a faster pace as new devices appear. Regulations will change to adapt to the new reality of IT, like privacy by default. Realtime processing of information will overtake data collection for later processing. Security and trust alliances will become more necessary, since information is more volatile.

In 1999, when I experienced the first SAN implementations, many CIOs told me that there would be a limit on data storage, since the complexity of handling two or more networks in the DC would become too much. Looking back, the technology vendors and IT departments established a quite good understanding of how to train and build out teams of architects for SAN networking in parallel with the IP networks. Now the complexity in data storage is rising, and DC managers are thinking about how to deal with it. The future will tell, but my view is that networking will converge with the SDN concepts, where the physical network is decoupled from the logical view. In this case SAN and IP can be managed from one point. Furthermore, with the appearance of virtualization, the future of resource management will come together as well. Observing the technology development, the industry demand for information management, the cloud burst and the consumerization of information management, the path leads to a generic resource in IT which is assigned a task. In other words: why have a storage device, a computing device, a networking controller or even a PC? It is defined by the demand. There will be an efficient device for each task in the first step, but I strongly believe that in a couple of years the physical layer will no longer be the architectural definition of the DC. It will be the software.
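The “generic resource assigned a task” idea can be sketched as a software broker over a pool of identical nodes: software, not hardware, decides which node is storage, compute or networking at any moment. Roles and node counts here are purely illustrative.

```python
# Sketch: software assigns roles to a pool of generic nodes on demand.
# Role names and pool size are invented for illustration.

class ResourcePool:
    def __init__(self, nodes: int):
        self.free = nodes
        self.assigned = {}  # role -> number of nodes

    def assign(self, role: str, count: int) -> bool:
        """Give `count` generic nodes the given role.
        Returns False when the pool is exhausted."""
        if count > self.free:
            return False
        self.free -= count
        self.assigned[role] = self.assigned.get(role, 0) + count
        return True

pool = ResourcePool(nodes=10)
pool.assign("storage", 4)
pool.assign("compute", 5)
print(pool.assigned, "free:", pool.free)
```

The point of the sketch: the architectural definition lives in the `assign` calls, i.e. in software, while the nodes themselves stay interchangeable.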

The Earth

(c) by NASA

So why would a CEO today think about putting money into an own datacenter or IT department?

Most probably to gain an advantage over the competition by having applications better customized to the business process. Today I find it very interesting that CIOs get asked by the business to compare themselves with the rest of the world. It is hard for them, since the internal BUs often do not compare fairly. The “good enough” principle will only work if it is also applicable internally.

Security demands, internal processes and other guidelines often drive the IT department into additional cost.

So the question should not be “do I need an own DC or IT department”; it should be a question about responsibility for the information. There is a reason why the DC is called a datacenter and the CIO is called “Chief Information Officer”, not “Chief Datacenter Officer”. In this context it is obvious that the DC no longer needs to exist in physical form; it is more the collection of services and the flow of information that the responsible manager should look after.

Finally, the datacenter will become the Turing machine!
