People lead the Digital World


In recent years, more and more discussions have arisen around the Internet of Things and its professional counterpart, Industry 4.0. It is largely a consequent step from the cloud approach the IT industry is currently pushing, and before that from the first iteration, which was called ASP (Application Service Provider). Funny enough, all these concepts and implementations are sold under the umbrella of more efficiency and productivity for men and women.

Does this really lead to more productivity?

It depends… There is no true answer, and time will tell. Based on various studies, the productivity gain is real, but it was consumed by the more complex integration of the digital world. The early digital assistants, like electronic mail, the Internet and the electronic calendar, among others, led to more productivity for individuals. The recent shifting of more and more decisions onto non-experts has led to less productivity overall. Where in the late 90’s an admin organized many parts of the business, now more and more highly paid employees do this themselves. Of course there are a lot of supporting tools. However, it is obvious that experts handle the right processes much better; individuals cannot match this when they perform those tasks only rarely.

Companies add more software, meaning they optimize cost further, to support this evolution, but at the end of the day individuals often take more time to get the process done in the way they think benefits them most. This results in less productivity. Maybe not for the company, since a lot of this is absorbed by extra hours; prior to the digital revolution, that was called private time. Always on and always available means less productive for the individual.

How about society?

Evolution requires education; revolution requires new thinking and more education, as well as new processes and procedures. With the current pace at which IT and other businesses change their models, many people could not keep up, so they were often lost and productivity went down. Does it matter? Yes! We have to start to judge productivity no longer by one individual or a group of individuals, but by the impact on the economy. Only this way can we innovate further.

Evolution or revolution?

If enough people adopt a totally new change, we speak of a revolution. Evolution happens every day and has been a constant since the early days of mankind. My opinion is that a revolution in society can be good if it is carried by the majority of people and nobody is left behind on purpose. This is a tricky process. However, in the past, revolution and evolution were centered around mankind. In more recent history you hear more and more voices that talk about a revolution in robotics and artificial intelligence. This will mean that mankind no longer sits in the driver’s seat, even if a few believe they can control that revolution.

 

Do we then still need humanity?

The question would rather follow the „why“. Algorithms can do things without „error“; robots can work day and night. This leads to the ultimate question: for whom? If more and more humans are replaced by smart machines or algorithms, what will then happen to humanity? There is a simple answer from history: it will not survive; maybe some survive the first iteration, but the end state is one without mankind.

Productivity around humans

Taking this thought further, it leads to the ultima ratio that we have to put man and woman, child and parents, old and young at the center, as the pole from which to drive productivity gains. All efforts have to use productivity to improve the personal lives of people, not to replace them in the first place. To get this right, there will be some duties, work or other tasks which will benefit from removing humans from the center. The goal, it seems to me, is upskilling, starting with the children, for better productivity paired with the skills mankind has to offer as a complement.

 

See also

Machines Can't Flow

 

IoT – Chapter I: The Things in the Internet and the connection


Why IoT now?

In the not so young past, miniaturization caused a variety of new concepts to drive the markets, and it was introduced into many markets. In IT it is called, and originated from, the move towards the softwarization of hardware. Yes, that is the main driver of digitalization. Where the focus in the 90’s was on CNC production, process automation and improving the way humans work, it turned in the 00’s more towards software. With the concept of open software and mass development, higher layers of software development opened the door to connecting devices. Not to mention the standards born on the IT side, like IP and the ISO models, which led to a harmonization of the infrastructure. For around three years the trend towards a unique backbone, which we call IT infrastructure, consisting of HW/SW/middleware and transparent to software development, has opened a window to connecting more and more different devices. Still, many questions are not yet answered, like security, ownership of generated and collected data, responsibility, governmental aspects, profiling, among others. However, one of the burning open questions of the 00’s was solved with the introduction of HADOOP and the MapReduce concept. Combined with the open software approach, it defined the starting point of the big data era, or, as I would call it, the era of information. Now the Internet of Things was born. With the combination of Google and Apple for mobile, as well as Facebook and Red Hat for information collection and the backbone OS, the collected data became much easier to personalize and enrich with data sources collected elsewhere. Of course there are still many companies that opt out of sharing and implement their own standards to hold on to proprietary data formats. In the long run they will follow the market and join the collective; I am quite sure, and history has proven this more than once.
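As an illustration, the MapReduce idea behind Hadoop can be sketched in a few lines of plain Python. This is a toy word count showing the map, shuffle and reduce phases, not the actual Hadoop API:

```python
from collections import defaultdict
from itertools import chain

def mapper(line):
    # Map phase: emit (word, 1) pairs for every word in a line.
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle phase: group all emitted values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    # Reduce phase: aggregate the values for one key.
    return key, sum(values)

lines = ["the internet of things", "the era of information"]
pairs = chain.from_iterable(mapper(line) for line in lines)
counts = dict(reducer(k, v) for k, v in shuffle(pairs).items())
print(counts["the"])  # each phase can run on a different machine
```

The point of the pattern is that each phase is independent, which is why Hadoop can spread the work over thousands of machines.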

What is IoT?

At its core, the Internet of Things is the collection of data from countless varieties of devices. These can be machines, human body sensors, cars, trains, databases, watches, environmental sensors and mobile devices, among many other possibilities. Smart homes and smart cities are the next steps the industry is currently targeting.
In addition to the devices and islands of data-producing entities, the Internet of Things is marked by the collection and combination of the data. These are stored in various new kinds of databases. In the end, the key is to derive new insights from more data. So in essence, the Internet of Things generates facts from billions of sensors combined with other data stored in the Internet.
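A minimal sketch of that "collect and combine" step, with made-up sensor readings and a reference table; all names and numbers here are purely illustrative:

```python
from statistics import mean

# Raw readings from two hypothetical temperature sensors.
readings = [
    {"sensor": "s1", "temp_c": 21.5},
    {"sensor": "s1", "temp_c": 22.1},
    {"sensor": "s2", "temp_c": 19.8},
]

# Reference data collected somewhere else: where each sensor sits.
locations = {"s1": "office", "s2": "warehouse"}

# Combine both sources into a per-location average, i.e. an insight
# that neither source contains on its own.
by_location = {}
for r in readings:
    by_location.setdefault(locations[r["sensor"]], []).append(r["temp_c"])

insight = {loc: round(mean(temps), 1) for loc, temps in by_location.items()}
print(insight)  # {'office': 21.8, 'warehouse': 19.8}
```

Scaled up to billions of sensors and many more data sources, this join-and-aggregate step is exactly what the new database concepts mentioned above are built for.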

What is then Industry 4.0?

Under the term Industry 4.0, the digitalization of the process automation industry is often mentioned. From my point of view this is only a small fraction. Industry 4.0 describes all attempts to digitalize processes in industry. This covers retail, manufacturing, connected cars, aerospace, transportation as well as healthcare and other industries. The focus here is not only on improving the quality of the data; it also drives a totally new way to go to market: the as-a-service approach. This means that companies no longer supply their offerings as a one-off, but as a constant service. It is what the telephone companies already do in their business models. Other companies, driven by many startups, now do the same in their business-to-business (B2B) efforts.

from http://fabiusmaximus.com

Future is Today

What is next?

The digitalization will not come to an end. It will consume more and more industries. As mathematicians would say, „the world is built on numbers and functions“. This process will drive IT and information skills into the business units. The traditional IT departments will be commoditized and often move to a provider. I see that happening in the next 10 years, like it happened to the telecom models in the late 90’s. Today no large enterprise runs its own telecommunications department; it is all integrated into IT. In the next 5 years, all non-critical systems will be run by specialized providers. Global and large enterprises have to understand the impact of open software, information and data science. With the sharing of information and the connection of things, security will be a critical asset to understand. Information and data will be assets which find their way onto the balance sheet.
In the next years the question will no longer be what, but why companies still need their own IT.

 

Transforming the Enterprise


In recent years, the change in a lot of industries has been from a traditional business approach, developed over decades, to a software-defined version of it. There are compelling reasons why this has happened.


When the industry develops products, it often takes years until they come out. Take cars, for example: all cars currently in mass production started their design and development in the last 5-10 years. Updates often go only into the next generation of the product cycle.

This is obvious for mechanical parts, but software can be adopted much faster. A good example is Tesla Motors, which changed the industry with the concept of building a computer in the form of a car. Software is updated nightly over the air, and new functionality becomes available to the driver or passenger. But not only this has changed; the selling of that kind of car is also different. While for traditional car dealers it is an exercise to train all the sales personnel on new functions and features, new leasing models or service capabilities to explain to the customers, modern companies move the sales structure to the Internet with an easy-to-update and adjustable model. As a result, options and selling capabilities depend more on the flexibility and creativity of the company than on the salesforce and its adaptability. The new model traditional enterprises stumble into deeply demands the adoption of agile and innovative behavior and processes to leverage the demand and open new segments to do business in.

Why is this happening?

Because it is possible. With the appearance of the cloud and the models supported through it, startups have shown that it is easy to build a business without a large investment in infrastructure or a data center. Even more: in the past you had to ask investors for a large amount of money to build the data center; now you can pay as you build your business. This enables investing the capital in the business model rather than in the IT landscape. But this is only one aspect. With the commoditization of IT resources and container-based IT, it is much more cost-efficient and reliable to build enterprise-class IT with a minimum of investment. However, there is a trap many companies will fall into, which is standardization. Currently there is a belief that one cloud standard, driven by the cloud providers, can be the right one, but history has shown that this will lead to more cost and will be replaced in time by an industry association. We see this on the horizon with OpenStack already, which is still far from enterprise-ready. The key will also lie more in the PaaS layer with open software, like Cloud Foundry and Docker, which opens a broader ecosystem for applications and operations.

Innovation Hand, illustration by Dinis Guarda


So what about enabling the “New” Enterprise model?

The new model will be driven through innovation in software and applications. In my daily talks with large companies and customers, many of them think about how to implement these two aspects into their business process modelling. Often it is driven out of the IT department, but the link to the business and to its drivers is missing or simply not established. I see large enterprises and global companies investing in application development through the lines of business and building up a second pool of IT knowledge, one that is closer to the business than to agile development. This not only often leads to a wrong assessment of the best development environment, it also creates a new class of information islands. In the long run this will not be the right innovative approach for many enterprises, but it lets them adapt and compete much better with the new kids on the block, the startups. My advice to CIOs and cloud architects is always to engage actively with the business departments and help them change to a more agile and innovative model, which we call continuous innovation, but also to get the IT expertise in return to make the right strategic decisions for the company.

IT providers, like EMC and the Federation, enable this process and also guide customers through it. With various iterations EMC can analyze the current status of an IT department and show the path from a 2nd-platform concept to the modern web-scale architecture that the 3rd-platform concept demands. Since this is not a “shoot once and forget”, in IT terms the “New” model is also a constant change. Where the past was about management of resources and striving for more synergy and “innovation” through new HW/SW, in the next decade IT departments will rather be brokers of public and private clouds, perhaps also for other companies as an additional service.

How to proceed?

It is not simple and has to be done step by step, since the current change of the business model in many verticals is not only driven by development and operations aspects, it is also deeply influenced by big data concepts, which often lead to an Internet of Things discussion. Silos and the public cloud may be an answer; the key to success I see in many cases is a joint effort of the business units and the people responsible for IT in the enterprise.

What kind of Analytics?


In various discussions with my customers and colleagues, I have experienced a very controversial debate around analytics.

Some understand it as „what has happened“, also called root cause; others want to predict the future, meaning „what will happen“; a third group tries to answer the question „what could happen“. Funny enough, the answer is often a combination of these. When it comes to our very fast decision-making and consuming society, this can cause some friction.
In the first case, let’s call it descriptive analytics, it is the reason many companies invest money: to avoid the incident, but only after the unpredicted incident has happened.

 

 

Often you find it difficult to get to the root cause, since the data or information is simply no longer available to get to the bottom of it. In the second case, statistical models show the business, based on historical information, what can go wrong; let’s call it predictive analytics. In the last case, data scientists use machine learning to find the possible future based on different information streams; this is known as prescriptive analytics. Like in chess, where at the first move everything is possible and some prediction could be made based on information about the players, over time, as more moves are taken, the information gets clearer and so do the moves.
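A toy version of the predictive case: fit a straight line to historical values and extrapolate one step ahead. The numbers are invented and the model is deliberately the simplest possible, pure Python with no libraries:

```python
def fit_line(ys):
    # Ordinary least squares for y = a + b*x with x = 0, 1, 2, ...
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Historical monthly incident counts (made-up numbers).
history = [10, 12, 14, 16]
a, b = fit_line(history)

# "What will happen": extrapolate to the next month.
forecast = a + b * len(history)
print(forecast)  # 18.0
```

Real predictive analytics replaces the straight line with richer statistical models, but the shape of the exercise is the same: learn from history, then project forward.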

 

The human brain is capable of doing this kind of prescription on the fly; we call that experience. By the way, that is one reason why young managers or entrepreneurs often fail, but that is worth another blog. The key here is that the brain of an expert has built up enough information and channels that it can do both, predictive and prescriptive.
Having scientists build the model, meaning the mathematical representation of the information combination, can lead to a variety of possible outcomes; we call that data science or machine learning. Key is the amount of information: deep and reaching far back in history. This is also one of the boundaries.

 

Many CIOs and CTOs of companies I talk to today do not keep the data, for many reasons, mostly cost. So what can a data scientist then do? Simple: he builds the model and then runs it. Over time it will get better, preserving the information. Like in chess, where you can watch extremely professional players analyze and predict movements to a depth the untrained could not imagine, or even provoke movements. This is not yet embedded in this science.

 

A couple of weeks ago I was fortunate to see a start-up and talk to the founders. It was all about customer intimacy and combining information within a company to serve a better customer relationship. I would drive this one dimension deeper. Why not use this as an internal knowledge base? Let’s predict what the corporate user needs to find around a specific subject; not only internal sources, external ones can also be used here.

 

I am still wondering why HR representatives and headhunters do not use predictive analysis to identify the right candidate for a job.
Yes, you’re right: the information in a company is often filed away in such a way that we cannot combine it. Here we go with the concept of a data lake: building one repository of information and using it for internal and external benefit.


However, the last topic is the „what could happen“ case, prescriptive analytics. This is all about possibilities, and often risks. I guess this is used in Six Sigma, and mankind is used to doing it too: the mother warns the child about a possible accident when walking along the street, and a company’s strategic plans are all about such scenarios. I think this method means „I do not have enough data to be more precise.“

 

So what to do? Keep all the data, maybe randomized and anonymized, but keep it, because you never know what business can be built out of the treasure the company has generated over the years.
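One hedged sketch of "keep it, but anonymized": replace direct identifiers with keyed hashes before archiving, so records stay joinable without exposing who they belong to. The field names and the salt are invented for the example:

```python
import hashlib
import hmac

SECRET_SALT = b"rotate-me-and-store-me-separately"  # illustrative placeholder

def pseudonymize(value):
    # Keyed hash: the same input always maps to the same token,
    # so anonymized records can still be joined and analyzed later.
    return hmac.new(SECRET_SALT, value.encode(), hashlib.sha256).hexdigest()[:16]

record = {"customer": "Jane Doe", "order_total": 99.90}
archived = {"customer": pseudonymize(record["customer"]),
            "order_total": record["order_total"]}
print(archived["customer"])  # stable token instead of the real name
```

Whether such pseudonymization counts as true anonymization depends on the legal context and on what other data could re-identify a person; the sketch only shows the mechanical part of the idea.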

 

Welcome to the Age of Information!

Disrupt the Disruptor



Platform (Photo credit: Geir Halvorsen)

In a recent interview the CEO of Pivotal, Paul Maritz, stated that he wants to disrupt the disrupter. The AWS platform offering addresses the majority of public cloud demands today, and Pivotal wants to play strong here too.

So what is the disruption AWS is addressing?
“Fast”, a data center by credit card, to name one. “Easy” maybe another. “Cheap” is also stated. All this has nothing to do with a conventional data center; rather the opposite: slow, complex and expensive is what CIOs today see in their environments. This is the disruption. There is only one missing piece, “Trust”, or “Risk”. Is this not what data centers are built for?
Let’s circle back in time to where the mainframe was born; let’s call that platform one. The applications written there largely determined the production systems. Of course there were few users and little demand, but the origin was with the developers.
History has shown that only one player was left after 30 years. The programs are still running and working, but they are hard to maintain and unable to keep up with current standards of customer experience.
With the age of the PC, and even more with the appearance of Linux and the x86 architecture, the second platform was born; we called that client-server architecture. Here too the developers led the way with creativity and innovation. Many languages were born and ended up in the Internet itself. The hardware was still very expensive and followed Moore’s law for years. Microsoft became the dominant player. This generated the complexity in the data center, since the legacy had to work with the new stuff. Silos were built and middleware controlled the systems.
“Suddenly” Google- and Facebook-like companies were born, and storage, CPU and network were more or less given away for free. They connected billions of users with millions of applications. The app century was born; let’s call it the third platform. And again the developers paved the way. From my perspective, most of the current apps are specialized browsers with a nice interface to information.
Here we go: that is the buzzword we will see more and more in the future. Information, which is data with meaning, was always generated in companies and kept very close. Now open data appeared, mobile devices generate more data, and information suddenly could be enriched to put new business models in place.
Where, in platforms one and two, business leaders had to invest in data centers without knowing how large the business would grow, now they can purchase on the go and on demand via a credit card. Nice.
Many apps and services consumers buy today leverage such infrastructure.
Hence, guess what: most of the startups leverage this model.

 


Pivotal Labs (Photo credit: teamstickergiant)

 

 

Now how to disrupt this growing business?
The answer lies in the definition of which business. Large global companies will have big trouble running their business on the current offerings. Even if the current cloud providers adapt fast and are very creative, there will be a lot of legacy which cannot be transported, just as the mainframes could not be transported to the client-server architectures. The key is in combination, or bridging. Investments which have been taken, and will be taken, to support the current business models have to be adjusted for the next-generation architecture. EMC’s federation approach is aiming for that. Complexity of infrastructure will be solved by converged and software-defined-X concepts; in optimization and orchestration of the infrastructure VMware is leading the industry by a magnitude, and Pivotal will provide the open platform architecture to combine the business needs of current and future demands.
When we talk technology, Cloud Foundry and major parts of Pivotal One, which comprise many supreme technologies, are open and crowd-developed to exponentially capture the great ideas of this planet. This is the real disruption. Platform one was very country-centric, platform two was dominated by the thinking of the western world; the next platform has to address a global demand and population.


Why Standards Matter


Do we really need standards in infrastructure?


Mandelbrot by Frax for iPad

For a couple of years now I have had discussions with CIOs and other technology employees in global companies about standards. In fact, the introduction of the SAN as a protocol standard enabled the largest consolidation and optimization in the data center, starting in the early 2000s and lasting until now. One of the reasons is that there has to be a certain maturity of the market, and also a common sense that there is no benefit in defining a “company” standard.
As we have seen in the telco industry, where over the years companies defined their own phone infrastructure, everything has now changed to a VoIP-driven model with a lot of benefits. We see the same nowadays in the data center with the appearance of converged infrastructure. However, this approach is not yet mature enough for one single standard, so the manufacturers of infrastructure and the VARs each define their own way. This currently has a major impact at customer sites, since the next generation will either be a converged model defined through organizations like SNIA and OpenStack models, or, more likely, a 3rd-platform approach, which may be called cloud.
Whatever the next years bring, the key is that decisions are driven not only by price but by the TCO and the future of the proposed architecture.
It is obvious that innovation cycles will shorten and that the demand for more flexibility on the business side will drive different infrastructure needs.

What happened on the manufacturer side?

Keep it simple: that is the demand formulated by the creators of the 3rd platform. However, in global enterprises IT still runs on the 1st or, mostly, the 2nd platform. This has to be taken into consideration. Associations like SNIA take this as a basis and define standards which bridge the new, innovative requirements and the current business-driven IT requirements. Industry standards take a while to establish, and often the manufacturers develop their own “standard” to keep customers close. In the early stages of a new technology this may be very convenient, but as the approach matures, the move to industry-wide adoption becomes necessary.
The same is currently happening in SNIA. The standards around the traditional SAN are defined, and management capabilities like SMI-S are defined and adopted. New areas like big data, object storage, analytics and flash are still more in the definition phase, and the manufacturers define their own strategies and APIs. SNIA is working closely with the vendors to get the industry into better shape here. Seeing a lot of startups still appearing, and innovation happening there too, the industry standard definition is still at its starting point; but customers would be wise to ask for certified products to make TCO and technology adoption more efficient, serving not only the cost model but also the new demands of their business.

Mandelbrot by Frax

The MS-DOS Phase of Big Data


The first developers of IBM PC computers neglected audio capabilities (first IBM model, 1981). (Photo credit: Wikipedia)

Most recently I was invited to talk to various CIOs in the German context. The focus of my presentation was on the next years: what to expect from society, how IT changes everything, how to prepare and what to educate the employees on. IDC talked about the concept of the 3rd platform. Thinking about the big data hype while preparing the day, I was reminded of my times with MS-DOS. There was not only one approach in the “PC era”: there was DR-DOS, Apple, still mainframes, etc.

One of the reasons MS-DOS got more attention was its affinity to IBM and the combination of a device with a marketplace at that time. IBM also left the space open for others to develop software, add-ons for the PC, etc. This generated momentum.

Looking back, the development up to Windows 7 took a close combination of physics, like Intel, software, like Microsoft, and a change in the market.

We are currently experiencing the same with cloud, virtualization and the future stage of a software-defined datacenter, and I do not think we have passed the MS-DOS phase yet.

So what is the MS-DOS phase?

Plenty of players, no defined market, overselling of functionality, misleading abbreviations and very poorly educated experts, the best of whom come from the vendors themselves.

In all my talks with CEOs and CTOs across industries, the same questions are open:

  • What can I do with this technology?
  • Who can help me build out the business context?
  • Is it already time to invest?

There is a clear answer: it depends!

It depends on the maturity of the IT department:

  • Are they still just maintaining the IT, or do they drive innovative, business-related IT processes?
  • Where is the infrastructure? Is the IT department still deploying storage, network and servers, or are they running a converged infrastructure with a services layer?
  • Do IT departments still focus their education on infrastructure, or are they hiring data scientists?

MD Tag: MS DOS (Photo credit: shawnblog)

If you can answer this for your IT department, you are much closer to leaving the MS-DOS phase behind you.

In the past, many business models were established and then ran for years. This will no longer be true in many traditional businesses. This business, too, is often in an MS-DOS phase, but it will adapt through the new market drivers: consumerization, urbanization, mobility, machine-to-machine decision processes and new adaptive computation processes which adjust the business to the demands of the users faster than real time.

New concepts are unpredictable, like crowdsourcing, open source, Raspberry Pi physics and machine learning. There is only one big truth: the step from MS-DOS to OS X will not take 20 years again; it will happen tomorrow.
