People lead the Digital World


In recent years, more and more discussions have arisen around the Internet of Things and its professional sibling, Industry 4.0. Essentially it is the consequent next step from the cloud approach the IT industry is currently pushing, and before that from the first iteration, which was called ASP (Application Service Provider). Funny enough, all these concepts and implementations are sold under the umbrella of more efficiency and productivity for men and women.

Does this really lead to more productivity?

It depends… There is no true answer, and time will tell. Based on various studies the productivity gains are real, but they were consumed by the more complex integration of the digital world. Early digital assistants like electronic mail, the Internet and the electronic calendar, among others, did lead to more productivity for individuals. The recent shifting of more and more decisions onto non-experts, however, has reduced productivity overall. Where in the late 90's an admin organized many parts of the business, now more and more highly paid employees do this themselves. Of course there are a lot of supporting tools. However, it is obvious that experts handle the right processes much better than individuals who perform them only seldom.

Companies add more software to improve, meaning to optimize cost further and support this evolution, but at the end of the day individuals often take more time to get the process done with the tools they believe give them the best benefit. This leads to less productivity. Maybe not for the company, since a lot of it is absorbed by extra hours; prior to the digital revolution, that was called private time. Always on and always available means less productive for the individual.

How about society?

Evolution requires education; revolution requires new thinking and more education, as well as new processes and procedures. At the current pace at which IT and other businesses change their models, many people cannot keep up, so they are often lost and productivity goes down. Does it matter? Yes! We have to stop justifying productivity by one individual or a group of individuals and start justifying it by its impact on the economy. Only this way can we innovate further.

Evolution or revolution?

If enough people adopt a totally new change, we speak of a revolution. Evolution happens every day and has been a constant since the early days of mankind. My opinion is that a revolution in society can be good if it is carried by the majority of people and nobody is left behind on purpose. This is a tricky process. In the past, however, revolution and evolution were centered around mankind. In more recent history you hear more and more voices talking about a revolution in robotics and artificial intelligence. This will mean that mankind no longer sits in the driver's seat, even if a minority believes it can control that revolution.

 

Do we then still need humanity?

The question would rather follow the „why“. Algorithms can do things without „error“, robots can work day and night. This leads to the ultimate question: for whom? If more and more humans are replaced by smart machines or algorithms, what will then happen to humanity? There is a simple answer from history: it will not survive. Maybe some will in the first iteration, but the end state is without mankind.

Productivity around humans

Following this thought leads you to the ultima ratio: we have to put man and woman, child and parent, old and young at the center, as the pole around which productivity gains are driven. All efforts have to bring productivity that improves people's personal lives, not replace them in the first place. To be fair, there are some duties, jobs and other tasks which will benefit from removing humans from the center. The goal, it seems to me, is to upskill people, starting with the children, for better productivity paired with the skills mankind has to offer as a complement.

 

See also

Machines Can't Flow

 

IoT – Chapter I: The Things in the Internet and the connection


Why IoT now?

In the not so distant past, miniaturization gave rise to a variety of new concepts that drive the markets. This was introduced to many markets. In IT it is called, and originated from, the move to softwarization of hardware. Yes, that is the main driver of digitalization. While in the 90's the key was CNC production, process automation and improving the way humans work, in the 00's the focus turned more to software. With the concept of open software and mass development, higher layers of software development opened the door to connecting devices. Not to mention the standards born on the IT side, like IP and the ISO models, which led to a harmonization of the infrastructure. For around three years now, the trend toward a unique backbone, we call it IT infrastructure, which consists of HW/SW/middleware and is transparent to the development of software, has opened a window to connecting more and more different devices. Still, many questions are not yet answered, like security, ownership of generated and collected data, responsibility, governmental aspects, profiling, among others. However, one of the burning open questions of the 00's was solved with the introduction of Hadoop and the MapReduce concept. Combined with the open software approach, it defined the starting point of the big data era, or as I would call it, the era of information. Now the Internet of Things was born. With the combination of Google and Apple for mobile, as well as Facebook and Red Hat for information collection and backbone OS, the collected data became much easier to personalize and to enrich with data sources collected somewhere else. Of course there are still many companies that opt out of sharing and implement their own standards to hold on to proprietary data formats. In the long run they will follow the market and join the collective; I am quite sure, and history has proven this more than once.
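
To make the MapReduce concept concrete, here is a minimal sketch in Python (standard library only; word count is the classic textbook task, not anything specific to the products named above): the map step emits key/value pairs, the reduce step aggregates all values per key, and because neither step needs to see the whole data set at once, both can be spread over many machines.

    from collections import defaultdict

    def map_phase(document):
        # Map step: emit a (word, 1) pair for every word in the document.
        for word in document.lower().split():
            yield word, 1

    def reduce_phase(pairs):
        # Reduce step: sum all values that share the same key (word).
        counts = defaultdict(int)
        for word, count in pairs:
            counts[word] += count
        return dict(counts)

    documents = ["the internet of things", "the era of information"]
    pairs = (pair for doc in documents for pair in map_phase(doc))
    print(reduce_phase(pairs))
    # {'the': 2, 'internet': 1, 'of': 2, 'things': 1, 'era': 1, 'information': 1}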

What is IoT?

The basis of the Internet of Things is the collection of data from countless variations of devices. The data can come from machines, human body sensors, cars, trains, databases, watches, environmental sensors and mobile devices, among many other possibilities. Smart homes and smart cities are the next steps the industry is currently targeting.
In addition to the devices and islands of data-producing entities, the Internet of Things is marked by the collection and combination of that data, which is stored in various new database concepts. In the end, the key is to drive new insights out of more data. So in essence the Internet of Things generates facts from billions of sensors combined with other data stored in the Internet.
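
As a minimal sketch of this collect-and-combine idea (all names and fields here are illustrative assumptions, not a specific product API): readings from very different devices are normalized into one record shape and then joined with a second data source to produce a small insight.

    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class Reading:
        device_id: str   # which sensor produced the value
        kind: str        # e.g. "temperature" or "heart_rate"
        value: float

    # Readings from very different devices, normalized into one shape.
    readings = [
        Reading("car-17", "temperature", 21.5),
        Reading("watch-3", "heart_rate", 72.0),
        Reading("home-9", "temperature", 19.0),
    ]

    # A second data source to combine with (here: a static lookup).
    locations = {"car-17": "Munich", "home-9": "Berlin", "watch-3": "Munich"}

    temps = [r.value for r in readings if r.kind == "temperature"]
    print("average temperature:", mean(temps))
    for r in readings:
        print(r.device_id, "in", locations[r.device_id], "->", r.kind, r.value)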

What then is Industry 4.0?

Under the term Industry 4.0, the digitalization of the process automation industry is often mentioned. From my point of view this is only a small fraction. Industry 4.0 describes all attempts to digitalize processes in industry. This covers retail, manufacturing, connected cars, aerospace and transportation, as well as healthcare and other industries. The focus is not only on improving the quality of the data; it also drives a totally new way to go to market: the as-a-service approach. This means that companies no longer supply their offerings as a one-off, but as a constant service, like the telephone companies already do in their business models. Other companies, pushed by many startups, are now doing the same in their business-to-business (B2B) efforts.

Future is Today (image from fabiusmaximus.com)

What is next?

Digitalization will not come to an end. It will consume more and more industries. As the mathematicians would say, „the world is built on numbers and functions“. This process will drive IT and information skills into the business units. The traditional IT departments will be commoditized and often moved to a provider. I see that happening in the next 10 years, like it happened to the telecom models in the late 90's. Today no large enterprise runs its own telecommunications department; it is all integrated into IT. In the next 5 years all non-critical systems will be run by a specialized provider. Global and large enterprises have to understand the impact of open software, information and data science. With the sharing of information and the connection of things, security will be a critical asset to understand. Information and data will be assets which will find their way onto the balance sheet.
In the next years the question will no longer be what, but rather why companies still need their own IT.

 

Transforming the Enterprise


In recent years, a lot of industries have moved from a traditional business approach, developed over decades, to a software-defined version of it. There are compelling reasons why this has happened.


When industry develops products, it often takes decades until they come out. Take cars, for example: all cars currently in mass production started design and development within the last 5-10 years. Updates often go into the next generation of the product cycle.

This is obvious for anything mechanical, but software can be adapted much faster. A good example is Tesla Motors, which changed the industry with the concept of building a computer in the form of a car. Software is updated nightly over the air, and new functionality becomes available to the driver or passengers. But not only that has changed; the selling of this kind of car is different too. While traditional car dealers face the exercise of training all sales personnel on new functions and features, new leasing models or service capabilities so they can explain them to customers, modern companies move the sales structure to the Internet with an easy-to-update and adjustable model. As a result, options and selling capabilities depend more on the flexibility and creativity of the company, not on the salesforce and its adaptability. The new model that traditional enterprises stumble into deeply demands the adoption of agile and innovative behavior and processes to leverage the demand and open up new segments of doing business.

Why is this happening?

Because it is possible. With the appearance of the cloud and the models it supports, startups have shown that it is easy to build a business without a large investment in infrastructure or a datacenter. Even more: in the past you had to ask investors for a large amount of money to build the DC; now you can pay as you build your business. This makes it much easier to invest the capital in the business model and not in the IT landscape. But this is only one aspect. With the commoditization of IT resources and container-based IT, it is much more cost efficient and reliable to build enterprise-class IT with a minimum of investment. However, there is a trap many companies will fall into, which is standardization. Currently there is a belief that one cloud standard, driven by the cloud providers, can be the right one, but history has shown that this leads to more cost and will be replaced in time by an industry association. We see this on the horizon with OpenStack already, even though it is still far from enterprise-ready. The key will also lie more in the PaaS layer with open software, like Cloud Foundry and Docker, which opens a broader ecosystem for applications and operations.

Innovation Hand, illustration by Dinis Guarda

So what about enabling the “New” enterprise model?

The new model will be driven through innovation in software and applications. In my daily talks with large companies and customers, many of them think about how to implement these two aspects in their business process modelling. Often it is driven out of the IT department, but the link to the business and its drivers is missing or simply not established. I see large enterprises and global companies investing in application development through the lines of business, building up a second body of IT knowledge which is more enriched with the business than with agile development. This not only often leads to a wrong assessment of the best development environment, it also creates a new class of information islands. In the long run this will not be the right innovative approach for many enterprises, but it lets them adapt and compete much better with the new kids on the block, the startups. My advice to the CIOs and cloud architects is always to engage actively with these line-of-business departments and help them change to a more agile and innovative model, we call that continuous innovation, and in return to get the IT expertise needed to make the right strategic decisions for the company.

IT providers, like EMC and the Federation, enable this process and also guide you through it. With various iterations, EMC can analyze the current status of an IT department and show the path from a 2nd-platform concept to the modern web-scale architecture the 3rd-platform concept demands. This is not a “shoot once and forget”: in IT terms, the “New” model is a constant change. Was IT in the past a management of resources, striving for more synergy and “innovation” through new HW/SW, in the next decade the IT department will rather be a broker of public and private clouds, maybe also for other companies as an additional service.

How to proceed?

It is not simple and has to be done step by step, since the current change of the business model in many verticals is not only driven by development and operations aspects; it is also deeply influenced by big data concepts, which often lead to an Internet of Things discussion. Silos and public cloud may be an answer; the key to success, as I see it in many cases, is a joint effort of the business units and the people responsible for IT in the enterprise.

The new IT Landscape


A view from the Infrastructure

In the last 15 years the infrastructure landscape was defined by the demands of the business. This will of course not change. However, the approach where one business line demands middleware X and another middleware Y will stop. There is a profound reason for that.

In the last couple of years, the physical layer running the infrastructure has dramatically commoditized. This has reached a point where the savings for large enterprises no longer come in significant dimensions. The efficiency through server virtualization, and nowadays storage virtualization, has reached more than 80% in some enterprises. With new storage and server orchestration layers and additional concepts like the enterprise hybrid cloud (EHC), this can be tweaked further, but it first needs a different approach to IT operations.

Key here is the private cloud, which is similar to the public cloud offerings, but of course on premise.

So what is the catch?

Mainly the operations. In the traditional datacenter, many enterprise and global operational IT departments have built a structure that maps the silo approach of the LoB (Line of Business). You will find functions focused on servers, storage, networking, databases, middleware etc. Each of them has coordination functions with the LoB and cross-functional sections. In lots of talks I have with these entities in the IT department, they always claim that they can do it better than external companies like VCE, which offers converged infrastructure. Many of them also hide behind the “vendor lock-in” argument.

On the other side we see that this costs the companies a fortune. Often these IT departments spend 70% of their cost on it; put the other way around, they could save a lot of it.

What has changed?

With the concept of “as-a-service”, IT has the ability to automate many tasks and build a software layer as the final governance. With the new concept of SLAs built into the software-defined components, IT personnel no longer have to plan, define, think about and run everything themselves. Combined with converged infrastructure and the possibilities of software-defined, this changes the silo approach into a more holistic view of the datacenter. It not only saves cost and moves test and development of the infrastructure back to the vendor, it also allows a higher integration of resources to drive more efficiency.

How does LoB react?

Often they are already there. With the offerings of the public cloud, the development of new software in these organizations often happens without the IT department being involved. This is a major concern of the CIOs and CDOs, and one I hear very often. LoBs look at the business outcome; they now have alternatives to the internal IT, and they move off.

So what is next?

From my view, a lot will start with analyzing the current state of the IT department and how mature it already is in the as-a-service transformation. There are various offerings, like the IT Transformation Workshop from EMC, to define and reshape the IT landscape. Have a look at that.

So what about the applications?

Not so simple. There are three types of applications to be found in many enterprises.

First, applications which only deliver information and exist for historical reasons. Second, large monolithic enterprise apps, like SAP or Oracle applications. The third kind are new apps for the new business lines, touching web, mobile, social and cloud.

The first kind I would retire and replace with a database delivering the results. Maybe there are apps no longer used, but nobody has realized it? Shut them down. The second kind is more tricky; they have to be looked at case by case to build a migration strategy, and this may take months or years. The last kind I would put immediately on the new concept of infrastructure.

So what are the key characteristics of this infrastructure?

Automation and orchestration, commoditization and standardization. To drive more cost out of IT, the next generation of architecture has to follow these rules. More than that, it has to build an independent layer between the physical resources and the applications: an interface between the resources and the applications. Efficiency and time to provision can only be gained with automation. Modern architectures drive provisioning down from weeks to days or even hours, define the SLAs and report back the cost of the selected SLAs. They also report back whether a service has breached its SLA or has performed within the paid and agreed parameters.
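
A small sketch of that report-back loop (the SLA fields, threshold and price here are illustrative assumptions): each provisioned service carries its agreed parameters, and the automation layer compares measured values against them and reports cost and breaches.

    from dataclasses import dataclass

    @dataclass
    class SLA:
        name: str
        max_latency_ms: float   # agreed upper bound
        cost_per_month: float   # reported back to the consumer

    def check(sla: SLA, measured_latency_ms: float) -> str:
        # Compare the measured value with the agreed parameter.
        if measured_latency_ms > sla.max_latency_ms:
            return (f"{sla.name}: SLA breached "
                    f"({measured_latency_ms} ms > {sla.max_latency_ms} ms)")
        return f"{sla.name}: within agreed parameters, {sla.cost_per_month} EUR/month"

    gold = SLA("gold-db-service", max_latency_ms=5.0, cost_per_month=1200.0)
    print(check(gold, 3.8))   # within the agreed parameters
    print(check(gold, 7.2))   # breached, reported automatically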

Finally, this whole journey starts with the ability of the IT department to change and to understand the journey to the private cloud.

Image courtesy of pro-physic.de, EMC Corporation

Read More:

http://itblog.emc.com/category/it-transformation/

https://blogs.vmware.com/cloudops/it-transformation

The Human Body, the most advanced factory – the real big data


Since mankind became self-aware, we have sought to understand how the human body and its functions work and how the soul is integrated into the whole system. It has taken from the early days of earth until now to optimize the factory and develop the higher functions we like to call consciousness, brain and social empathy. The human ecosystem is in essence a factory. In numbers: every day around 20 billion cells are replaced by new ones, out of the 10,000 billion we have. This means that every 10 years your body is rebuilt. This costs energy, and in addition we also lose energy, around 50 to 360 watts, to keep the factory running. By the way, most fitness trackers take this into consideration. On a daily basis this adds up to around 2.9 kWh; an average draw of about 120 W over 24 hours, for example, gives 120 W x 24 h = 2.88 kWh. In the case of heavy work or sports this goes up, of course.

Comparing the datacenter with our body, we find astonishing parallels: the nervous system and the network, the blood and the power, the heating and the cooling. This comes from the same physics that we and the DC operate in. If we dig into it, we see that nature has solved most of the demands with much more creativity. We also find dedicated systems which act autonomously to keep the fabric running. I think of the limbic system and our ability to react to external wounds without a big escalation; root-cause analysis is done on the fly for the minor issues.

EMC DC Durham

A few more details on this fascinating comparison can be read in the next episodes:

  • Episode I: The Human Body, an optimized, fully automated factory
  • Episode II: The Blood in the human factory
  • Episode III: The Sensory
  • Episode IV: The Big Data approach in the Human factory
  • Episode V: The Control in the human Factory
  • Episode VI: The CyberControl in the Human factory
  • Episode VII: The Chain of Command in the Human Factory
  • Episode VIII: Automation in the human factory
  • Episode IX: Energy consumption model in the human factory
  • Episode X: Influence of the Soul in the digital Factory
  • Episode XI: Final thoughts on the digital factory

What kind of Analytics?


In various discussions with my customers and colleagues, I have experienced a very controversial debate around analytics.

Some understand it as „what has happened“, also called root cause; others want to predict the future, meaning „what will happen“; a third group tries to answer the question „what could happen“. Funny enough, the answer is often a combination of these. When it comes to our very fast decision-making and consuming society, this can cause some friction.
In the first case, let's call it descriptive analytics, it is the reason many companies invest money, but only after the incident, not predicted, has happened.

Often you find it difficult to get to the root cause, since the data or information is simply no longer available to get to the bottom of it. In the second case, statistical models show the business, based on historical information, what can go wrong; let's call it predictive analytics. In the last case, data scientists use machine learning to find the possible future based on different information streams; this is known as prescriptive analytics. It is like chess: at the first move everything is possible and only rough predictions can be made, based on information about the players; over time, as more moves are played, the information and the possible moves become clearer.
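
A minimal sketch of the difference between the first two kinds, on toy numbers and with the standard library only: descriptive analytics summarizes what has happened, while predictive analytics fits a trend to the history and extrapolates what will happen.

    from statistics import mean

    # Toy history: daily error counts of a service over one week.
    history = [3, 4, 2, 5, 6, 5, 7]

    # Descriptive: what has happened? Summarize the past.
    print("average errors per day so far:", round(mean(history), 1))

    # Predictive: what will happen? Fit a simple linear trend and extrapolate.
    n = len(history)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(history)
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, history))
             / sum((x - x_bar) ** 2 for x in xs))
    intercept = y_bar - slope * x_bar
    print("expected errors tomorrow:", round(slope * n + intercept, 1))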

 

The human brain is capable of doing this kind of prescription on the fly; we call that experience. By the way, this is a reason why young managers or entrepreneurs often fail, but that is worth another blog. The key here is that the brain of an expert has built up enough information and channels that it can do both the predictive and the prescriptive part.
Having the scientists build the model, meaning the mathematical representation of the information combination, can lead to a variety of possible outcomes; we call this data science or machine learning. Key is the amount of information, its depth and how far back its history reaches. This is also one of the boundaries.

Many CIOs and CTOs of companies I talk to today do not keep the data, for many reasons, mostly cost. So what can a data scientist then do? Simple: he builds the model and then runs it; over time, as the information is preserved, it will get better. Like in chess, where you can watch top professional players analyze and predict movements at a depth the untrained could not imagine, or even provoke movements. That is not yet embedded in this science.

A couple of weeks ago I was fortunate to visit a start-up and talk to the founders. It was all about customer intimacy and combining information in a company to serve a better customer relationship. I would drive this one dimension deeper: why not use this as an internal knowledge base? Let's predict what the corporate user needs to find around a specific subject; not only internal sources but also external ones can be used here.

I am still wondering why HR representatives and headhunters do not use predictive analysis to identify the right candidate for a job.
Yes, you're right: the information in a company is often filed away in such a way that we have no possibility to combine it. Here we go with the concept of a data lake: building one repository of information and using it for internal and external benefit.
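
A minimal sketch of the data-lake idea (sources and record shapes are illustrative assumptions): records from different systems land in one repository in their raw form, tagged with their origin, and cross-source questions are asked later, without an upfront schema.

    # One repository for raw records from very different sources.
    lake = []

    def ingest(source: str, record: dict) -> None:
        # Store records as-is, tagged with their origin; no upfront schema.
        lake.append({"source": source, **record})

    ingest("crm", {"customer": "ACME", "topic": "renewal"})
    ingest("mail", {"customer": "ACME", "topic": "complaint"})
    ingest("web", {"customer": "Globex", "topic": "pricing"})

    # Ask a cross-source question later: everything we know about ACME.
    print([r for r in lake if r.get("customer") == "ACME"])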


However, the last topic is the „what could happen“ case, prescriptive analytics. This is all about possibilities and often risks. I guess this is used in Six Sigma, and mankind is used to doing it too: the mother warns the child about a possible accident when walking on the street, and a company's strategic plans are all about such scenarios. I think this method means „I do not have enough data to be more precise.“

So what to do? Keep all the data, maybe randomized and anonymized, but keep it, because you never know what business can be built out of the treasure the company has generated over the years.

Welcome to the Age of Information!

Disrupt the Disruptor


Platform (Photo credit: Geir Halvorsen)

In a recent interview the CEO of Pivotal, Paul Maritz, stated that he wants to disrupt the disruptor. The AWS platform offering addresses the majority of public cloud demands today, and Pivotal wants to play strong here too.

So what is the disruption AWS is addressing?
“Fast”, datacenter by credit card, to name one. “Easy” maybe another. “Cheap” is also stated. All this has nothing to do with a conventional DC; quite the opposite: slow, complex and expensive is what CIOs today see in their environments. This is the disruption. There is only one missing piece: “trust”, or “risk”. Is this not what datacenters are built for?
Let's circle back in time to where the mainframe was born; let's call that platform one. The majority of applications written there determined the production systems. Of course there were few users and few demands, but the origin lay with the developers.
History has shown that there was only one player left after 30 years. The programs are still running and working, but they are hard to maintain and cannot stay in line with current standards of customer experience.
With the age of the PC, and more so with the appearance of Linux and the x86 architecture, the second platform was born; we called that client-server architecture. Here too, the developers led the way with creativity and innovation. Many languages were born and ended up in the Internet itself. The HW was still very expensive and followed Moore's law for years. Microsoft became the dominant player. This generated the complexity in the DC, since the legacy had to work with the new stuff. Silos were built and middleware controlled the systems.
“Suddenly” Google- and Facebook-like companies were born, and storage, CPU and network were more or less given away for free. They connected billions of users with millions of applications. The app century was born; let's call it the third platform. And again the developers paved the way. From my perspective, most of the current apps are specialized browsers with a nice interface to information.
Here we go, that's the buzzword we will see more and more in the future. Information, which is data with meaning, was always generated in companies and kept very close. Now open data has appeared, mobile devices generate more data, and information can suddenly be enriched to put new business models in place.
Whereas in platforms one and two business leaders invested in datacenters without knowing how large the business would grow, now they can purchase on the go and on demand via a credit card. Nice.
Many apps and services consumers buy today leverage such infrastructure.
Hence, guess what: most of the startups leverage this model.

 

Pivotal Labs (Photo credit: teamstickergiant)

Now how to disrupt this growing business?
The answer lies in the definition of which business. Large global companies will have big trouble running their business under the current offerings. Even if the current cloud providers adapt fast and are very creative, there will be a lot of legacy that cannot be transported, just as the mainframes could not be transported to the client-server architectures. The key is in combination, or bridging. Investments which have been made, and will be made, to support the current business models have to be adjusted for the next-generation architecture. EMC's Federation approach is aiming for that. The complexity of infrastructure will be solved by converged and software-defined-X concepts; in the optimization and orchestration of the infrastructure, VMware is leading the industry by a magnitude; and Pivotal will provide the open platform architecture to combine the business needs of current and future demands.
When we talk technology: Cloud Foundry and major parts of Pivotal One, which comprise many supreme technologies, are open and crowd-developed to exponentially capture the great ideas of this planet. This is the real disruption. Platform one was very country-centric, platform two was dominated by the thinking of the western world; the next platform has to address a global demand and population.


Why Standards Matter


Do we really need standards in infrastructure?

Mandelbrot, designed by Frax for iPad

For a couple of years now I have been having discussions with CIOs and other technology employees in global companies around standards. In fact, the introduction of the SAN as a protocol standard enabled the largest consolidation and optimization in the DC, starting in the early 2000s and continuing until now. One of the reasons is that the market has to reach a certain maturity, along with a common sense that there is no benefit in defining a “company” standard.
As we have seen in the telco industry, where over the years companies defined their own phone infrastructure, everything has now changed to a VoIP-driven model with a lot of benefits. We see the same nowadays in the DC with the appearance of converged infrastructure. However, this approach is not yet mature enough for one single standard, and the manufacturers of infrastructure and the VARs each define their own way. This currently has major impacts at customer sites, since the next-generation converged model will either be defined more through organizations like SNIA and open-stack models or, more likely, be a 3rd-platform approach, which may be called cloud.
Whatever the next years bring, the key is that decisions are driven not only by price but more by the TCO and the future of the proposed architecture.
It is obvious that innovation cycles will accelerate and that the demand for more flexibility on the business side will drive different infrastructure needs.

What happened on the manufacturer side?

“Keep it simple” is the demand set by the creators of the 3rd platform. However, in the enterprises of global customers, IT still runs on the 1st or, mostly, on the 2nd platform. This has to be taken into consideration. Associations like SNIA take this as a basis and define standards which bridge between the new, innovative IT and the current business-demand IT requirements. Industry standards take a while to establish, and often the manufacturers develop their own “standard” to keep customers closer. In the early stages of a new technology this may be very convenient, but as the approach matures, the move to industry adoption becomes necessary.
The same is currently happening in SNIA. The standards around the traditional SAN are defined, and management capabilities like SMI-S are defined and adopted. New areas like big data, object storage, analytics and flash are more in the definition phase, and the manufacturers define their own strategies and APIs. SNIA is working deeply with the vendors to shape the industry better here. Seeing a lot of startups still appearing, and innovation happening there too, the definition of industry standards is still at its starting point, but customers would be wise to ask for certified products to make the TCO and technology adoption more efficient, serving not only the cost model but also the new demands of their business.

Mandelbrot by Frax

The MS-Dos Phase Of BigData


The first developers of IBM PC computers neglected audio capabilities (first IBM model, 1981). (Photo credit: Wikipedia)

Most recently I was invited to talk to various CIOs in the German context. The focus of my presentation was the coming years: what to expect from society, how IT changes everything, how to prepare and what to educate employees on. IDC talked about the concept of the 3rd platform. Thinking about the BigData hype while preparing the day, I was reminded of my times with MS-Dos. There was not only one approach in the “PC era”. There was DR-Dos, Apple, still mainframes etc.

One of the reasons MS-Dos got more attention was its affinity to IBM and the combination of a device with a marketplace at that time. IBM also left the space open for others to develop software, add-ons for the PC etc. This generated momentum.

Looking back, the development up to Windows 7 took a close combination of hardware, like Intel, software, like Microsoft, and a change in the market.

We are currently experimenting with the same in cloud, virtualization and the future stage of a software-defined datacenter; I doubt we have passed the MS-Dos phase there yet.

So what is the MS-Dos phase?

Plenty of players, no defined market, overselling of functionality, misleading abbreviations and very poorly educated experts, the best of whom come from the vendors themselves.

In all my talks with CEOs and CTOs across industries, the same questions come up:

  • What can I do with this technology?
  • Who can help me build out the business context?
  • Is it already time to invest?

There is a clear answer: it depends!

It depends on the maturity of the IT department:

  • Are they still just maintaining the IT, or do they drive innovative, business-related IT processes?
  • Where is the infrastructure? Is the IT department still deploying storage, network and servers, or are they running a converged infrastructure with a services layer?
  • Does the IT department still focus its education on infrastructure, or is it hiring data scientists?
MD Tag: MS DOS (Photo credit: shawnblog)

If you can answer these questions for your IT department, you are much closer to leaving the MS-Dos phase behind you.

In the past, many business models were established and then ran for years. This will no longer be true in many traditional businesses. These businesses are often in an MS-Dos phase too, but they will adapt through the new market drivers: consumerization, urbanization, mobility, machine-to-machine decision processes and new adaptive computation processes which adjust the business to the demands of the users faster than real time.

New concepts are unpredictable, like crowdsourcing, open source, Raspberry Pi hardware and machine learning. There is only one big truth: the step from MS-Dos to OS X will not take 20 years this time; it will happen tomorrow.


My top 10 anticipations for 2013


The cloud will emerge in all parts of IT

More services emerged in 2012; their adoption will drive more cloud, and more cloud offerings will appear. This circle will speed up and drive more enterprises to evaluate the next private cloud.


The engine will be built in manufacturing, not on the customer's premises

In many enterprises, 2012 started an evaluation of converged infrastructure. Since most of the components used are standardized, „build on customer site“ or „do it myself“ will be questioned more and more by the CFO. As in the past, when whole servers were built by local shops or by the enterprises themselves, it is obvious that the purchasing department will take a different look when it comes to TCO. This will also start to include backup and security. Orders will move toward workloads and software demands rather than cores, PB of storage or network interconnects. The VCE model will become the standard delivery method for modern DC architectures.

Consumerization of IT will drive more business adoption in the cloud

The DC will have more direct contact with the end customers. In 2012 this was one of the main drivers, built through Google/Android and Apple/iCloud, and it will move faster. The new paradigm enables enterprises to optimize cost and the business model when talking directly to the end customers. In 2013 we will see this shift into computer-to-computer relationships as well. The interaction of the B2B mesh will lead to faster purchasing and information flow to optimize the business, and the early adopters will be the winners. Key here is an agile IT infrastructure.

Cloud will feed BigData, BigData will enable more cloud

Each one depends on the other. With more cloud, the information streams are easier to combine for BigData analysis; with more BigData analysis and applications, more agile computing environments (called cloud) become necessary. This will be a big trend, though it depends on the industry. New businesses will learn to use the information streams, and new forms of analytics will be generated to enhance processes and decision making.

Flash will lead the storage industry

In the last years, flash was used as a replacement for spinning disk. This has a huge impact on performance and the utilization of computing power. Man-years of brain power were spent developing smart algorithms to leverage the new tier in the storage. This will continue to evolve, and with the drop in price and increase in capacity the market will grow dramatically. However, flash has not only been squeezed into the spinning-disk mold; it has also allowed new algorithms to be designed and implemented to utilize the power of flash. When IOs are no longer the limiting factor, CPU power can be used to make the storage system take on many more tasks while still delivering high performance to the servers.
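
As a toy sketch of such an algorithm (the threshold and block names are illustrative assumptions): a simple tiering rule promotes frequently accessed blocks to flash and leaves cold blocks on spinning disk.

    # Toy tiering rule: promote hot blocks to flash, keep cold blocks on disk.
    access_counts = {"block-a": 950, "block-b": 12, "block-c": 430}

    HOT_THRESHOLD = 100  # accesses per hour; an assumed, illustrative value

    placement = {
        block: ("flash" if count >= HOT_THRESHOLD else "spinning disk")
        for block, count in access_counts.items()
    }
    print(placement)
    # {'block-a': 'flash', 'block-b': 'spinning disk', 'block-c': 'flash'}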

The CIO will refocus on the core: the Information

With all the changes in IT, the transformation of the roles in IT management will not stop. Since the DC gravitates toward the information, it is obvious that the CIO will be the master of the information: not only the information processed in the own DC, but also the information needed from outside, as in BigData applications, and the information given to the outside, i.e. to customers. New technologies like the Horizon application manager from VMware will support this transition, and the CIO and team will be transformed into the information broker, security agent and consultant for the business lines.

Standardization will enable more productivity

One aspect of going from private cloud to hybrid is standardization. Many companies, like Microsoft, Google or Amazon, among others, define APIs and push them as de-facto standards. Experience has shown that the early adopters often drive this, but run against a wall within a couple of years. Ethernet/IP and FC would not have been accepted so broadly had no standards body been formed. We currently see various associations taking on this role in the cloud, like the SNIA organization. This is the only way to help the DC out of the „do it by myself“ mode and let it focus on more business-relevant tasks. The engine (converged infrastructure) will be developed and assembled on the vendor's premises, and the enterprise DC managers can focus on utilization.

The storm on clouds will drive orchestration

When virtualization was first introduced to customers, the VM landscape could still be controlled by humans. It is similar to the SAN in the early 2000s, when the storage array was still close to a server. This will continue to change as we see tens of thousands of VMs in modern DC architectures. Orchestration will bring the agility that is necessary to drive more business flexibility. Vendors, led by VMware, will provide more and more sophisticated solutions to automate here.

Keys. (Photo credit: Bohman)

Security and Trust will be in the middle of everything

Since information is the key, as it was in the past, but now in a much more open world, securing it is one of the key elements. The business will ask for answers, and companies like RSA will lead the way. Not only securing but also trusting other organizations is essential. With new regulations and demands from the business, information has to become more trustworthy.

InMemory systems will draw investments from infrastructure

Since in-memory systems put more demand on main memory than on high IO rates, they will re-architect the infrastructure in this area. 2013 will show whether these new technologies add new fields of application or replace existing ones. Technologies like GemFire and HANA, among others, will drive faster decision making and new infrastructure architectures. Combined with flash, companies like SAP and EMC will drive the industry here.
