IoT – Chapter I: The Things in the Internet and the connection


Why IoT now?

In the not-so-distant past, miniaturization gave rise to a variety of new concepts that drove many markets. In IT this is called, and originated from, the softwarization of hardware, and it is the main driver of digitalization. In the 90s the key was CNC production and process automation, improving the way humans work; in the 00s the focus shifted more towards software.

People around the world

(c) by presentation load

With the concept of open-source software and mass development, higher layers of software development opened the door to connecting devices. Not to mention the standards born on the IT side, like IP and the ISO models, which led to a harmonization of the infrastructure. For a few years now the trend has been towards a unique backbone, which we call IT infrastructure: it consists of hardware, software and middleware, is transparent to software development, and has opened the window to connecting more and more different devices. Many questions are still unanswered, such as security, ownership of generated and collected data, responsibility, governance and profiling, among others. However, one of the burning open questions of the 00s was solved with the introduction of Hadoop and the MapReduce concept. Combined with the open-source approach, it marked the starting point of the big data era, or, as I would call it, the era of information. Now the Internet of Things was born. With Google and Apple for mobile, as well as Facebook and Red Hat for information collection and backbone OS, the collected data became much easier to personalize and to enrich with data sources collected elsewhere. Of course, many companies still opt out of sharing and implement their own standards to hold on to proprietary data formats. In the long run they will follow the market and join the collective; I am quite sure of that, and history has proven this more than once.

What is IoT?

The basis of the Internet of Things is the collection of data from countless varieties of devices. They can be machines, human body sensors, cars, trains, databases, watches, environmental sensors and mobile devices, among many other possibilities. Smart homes and smart cities are the next steps the industry is currently targeting.
In addition to the devices and islands of data-producing entities, the Internet of Things is marked by the collection and combination of this data, stored in various new database concepts. In the end, the key is to derive new insights from more data. So in essence, the Internet of Things generates facts from billions of sensors combined with other data stored on the Internet.
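
A minimal sketch of that idea, with invented sensor readings and a made-up reference table (none of this is from a real system), shows the pattern: collect readings from devices, then combine them with another data source to derive a simple fact.

```python
from statistics import mean

# Hypothetical sensor readings collected from different "things"
# (device id, metric, value); in reality these would arrive as streams.
readings = [
    ("thermo-01", "temperature_c", 21.4),
    ("thermo-02", "temperature_c", 23.9),
    ("car-17", "speed_kmh", 87.0),
]

# A second data source, e.g. reference data already stored somewhere on the Internet.
reference = {"temperature_c": {"unit": "C", "comfort_max": 23.0}}

def insights(readings, reference):
    """Combine raw device data with reference data into a simple derived fact."""
    temps = [value for _, metric, value in readings if metric == "temperature_c"]
    average = mean(temps)
    limit = reference["temperature_c"]["comfort_max"]
    return {"average_temperature": average, "above_comfort_limit": average > limit}

print(insights(readings, reference))
```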

What then is Industry 4.0?

Under the term Industry 4.0, the digitalization of the process automation industry is often mentioned. From my point of view this is only a small fraction. Industry 4.0 describes all attempts to digitalize processes in industry. This covers retail, manufacturing, connected cars, aerospace and transportation as well as healthcare and other industries. The focus is not only on improving the quality of the data; it also drives a totally new way to go to market: the as-a-service approach. This means that companies no longer supply their offerings as a one-off, but as a continuous service, much like the telephone companies already do in their business models. Other companies, driven by many startups, are now doing the same in their business-to-business (B2B) efforts.

from http://fabiusmaximus.com

Future is Today

What is next?

The digitalization will not come to an end. It will consume more and more industries. As the mathematicians would say, „The world is built on numbers and functions." This process will drive IT and information skills into the business units. The traditional IT departments will be commoditized and often move to a provider. I see that happening in the next 10 years, just as it happened to the telecom model in the late 90s. Today no large enterprise runs its own telecommunications department; it is all integrated into IT. In the next 5 years all non-critical systems will be run by a specialized provider. Global and large enterprises have to understand the impact of open software, information and data science. With the sharing of information and the connection of things, security will be a critical asset to understand. Information and data will be assets which will find their way onto the balance sheet.
In the coming years the question will no longer be what IT the companies still need to own, but why they need to own it at all.

 

The new IT Landscape


A view from the Infrastructure

In the last 15 years the infrastructure landscape has been defined by the demands of the business. This will of course not change. However, the approach where one business line demands middleware X and another middleware Y will stop. There is a profound reason for that.

In the last couple of years the physical infrastructure has become dramatically commoditized. This has reached a point where, for large enterprises, the savings no longer come in significant dimensions. The efficiency gained through server virtualization and nowadays storage virtualization has reached more than 80% in some enterprises. With new storage and server orchestration layers and additional concepts like the enterprise hybrid cloud (EHC) this can be tweaked further, but it first needs a different approach to IT operations.

Key here is the private cloud, which is similar to the public cloud offerings, but of course on premise.

So what is the catch?

Mainly the operation. In the traditional datacenter, many enterprise and global operational IT departments have built a structure that maps the silo approach of the LoB (Line of Business). You will find functions focused on servers, storage, networking, databases, middleware etc. Each of them has coordination functions with the LoB and cross-functional sections. In lots of the talks I have with those entities in the IT department, they claim that they can do this better than external companies like VCE, which offers converged infrastructure. Many of them also hide behind the "vendor lock-in" argument.

On the other side we see that this costs the companies a fortune. Often these IT departments tie up 70% of their cost here; put the other way around, they could save a lot of it.

What has changed?

With the concept of "as-a-service", IT has the ability to automate many tasks and build a software layer as the final governance. With SLAs built into the software-defined components, IT personnel no longer have to plan, define, think about and run everything themselves. Combined with converged infrastructure and the possibilities of software-defined everything, this changes the silo approach into a more holistic view of the datacenter. It not only saves cost and moves test and development of the infrastructure back to the vendor, it also allows higher integration of resources to drive more efficiency.
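
As a rough illustration only (the service names, tiers and numbers below are hypothetical, not any vendor's actual catalog), policy-based provisioning means a request carries an SLA tier and the automation layer resolves it into concrete resources instead of a person planning each one by hand:

```python
# Hypothetical SLA catalog: tier -> resource policy the automation layer enforces.
SLA_CATALOG = {
    "gold":   {"iops": 20000, "availability": 99.99, "backup_interval_hours": 1},
    "silver": {"iops": 5000,  "availability": 99.9,  "backup_interval_hours": 12},
}

def provision(service_name: str, tier: str, vcpus: int, storage_gb: int) -> dict:
    """Turn a service request plus an SLA tier into a concrete provisioning order."""
    if tier not in SLA_CATALOG:
        raise ValueError(f"unknown SLA tier: {tier}")
    policy = SLA_CATALOG[tier]
    return {
        "service": service_name,
        "vcpus": vcpus,
        "storage_gb": storage_gb,
        "policy": policy,  # the SLA travels with the workload
    }

order = provision("crm-portal", "gold", vcpus=8, storage_gb=500)
print(order)
```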

How does the LoB react?

Often they are already there. With public cloud offerings, the development of new software in these organizations often happens without any involvement of the IT department. This is a major concern of the CIOs and CDOs, and one I hear very often. LoBs look at the business outcome; they now have alternatives to the internal IT and they move off.

So what is next?

From my view, a lot will start with analyzing the current state of the IT department and how far it has already matured in the as-a-service transformation. There are various offerings, like the IT Transformation Workshop from EMC, to define and reshape the IT landscape. Have a look at that.

So what about the applications?

Not so simple. There are three types of applications found in many enterprises.

Applications which only deliver information and exist for historical reasons; large monolithic enterprise apps, like SAP or Oracle applications; and thirdly, new apps for the new business lines touching web, mobile, social and cloud.

For the first kind, I would retire them and replace them with a database delivering the results. Maybe there are apps no longer used, but nobody has realized that? Shut them down. The second kind is trickier and has to be looked at case by case to build a migration strategy, which may take months or years. The last kind I would put immediately onto the new concept of infrastructure.

So what are the key characteristics of this infrastructure?

Automation and orchestration, commoditization and standardization. To drive more cost out of IT, the next generation of architecture has to follow these rules. More than that, it has to build an independent layer between the physical resources and the applications, an interface between the resources and the applications. Efficiency and time to provision can only be gained with automation. Modern architectures drive provisioning down from weeks to days or even hours, define the SLA and report back the cost of the selected SLAs. They also report back whether a service breached the SLA or performed within the paid and agreed parameters.
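
As a minimal sketch (the metric names and thresholds here are invented for illustration, not taken from any specific product), SLA reporting boils down to comparing measured values against the agreed parameters and flagging breaches:

```python
# Hypothetical agreed SLA parameters and measured values for one service.
agreed = {"availability_pct": 99.9, "provisioning_hours": 24}
measured = {"availability_pct": 99.95, "provisioning_hours": 30}

def sla_report(agreed: dict, measured: dict) -> dict:
    """Report, per parameter, whether the service stayed within the agreed SLA."""
    report = {}
    for key, target in agreed.items():
        actual = measured[key]
        # Availability must be at least the target; durations must not exceed it.
        ok = actual >= target if key.endswith("_pct") else actual <= target
        report[key] = {"target": target, "actual": actual, "breached": not ok}
    return report

print(sla_report(agreed, measured))
```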

Finally, this whole journey starts with the ability of the IT department to change and to understand the journey towards the private cloud.

Image courtesy of pro-physic.de, EMC Corporation

Read More:

http://itblog.emc.com/category/it-transformation/

https://blogs.vmware.com/cloudops/it-transformation

What kind of Analytics?


In various discussions with my customers and colleagues, I have experienced a very controversial debate around analytics.

Some understand it as the „what has happened", also called root cause; others want to predict the future, meaning „what will happen"; a third group tries to answer the question „what could happen". Funnily enough, the answer is often a combination of these. In our very fast decision-making and consuming society this can cause some friction.
In the first case, let's call it descriptive analytics, it is the reason many companies invest money after an incident that was not predicted has happened, in order to avoid the next one.

 

 

Often it is difficult to get to the root cause, simply because the data or information is no longer available to get to the bottom of it. In the second case, statistical models show the business, based on historical information, what can go wrong; let's call it predictive analytics. In the last case, data scientists use machine learning to find the possible future based on different information streams; it is known as prescriptive analytics. It is like chess, where on the first move everything is still possible and only rough predictions can be made based on information about the players, while over time, as more moves are made, the information becomes clearer and the likely moves narrow down.
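
To make the predictive case concrete, here is a deliberately tiny sketch (the monthly sales figures are invented): a simple statistical model is fitted to historical data and then asked „what will happen" in the next period.

```python
import numpy as np

# Invented historical data: monthly sales for the last six months.
months = np.array([1, 2, 3, 4, 5, 6])
sales = np.array([100, 108, 115, 121, 130, 138])

# Fit a simple linear trend, a very small "statistical model".
slope, intercept = np.polyfit(months, sales, deg=1)

# Predictive analytics: "what will happen" in month 7.
forecast = slope * 7 + intercept
print(f"expected sales in month 7: {forecast:.1f}")
```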

 

The human brain is capable of doing this kind of prescription on the fly; we call that experience. By the way, this is one reason why young managers or entrepreneurs often fail, but that is worth another blog. The key here is that, in an expert, the brain has built up enough information and channels to do both the predictive and the prescriptive part.
Having data scientists build the model, meaning the mathematical representation of the combined information, can lead to a variety of possible outcomes; we call this data science or machine learning. Key is the amount of information: it must be deep and reach far back historically. This is also one of the boundaries.

 

Many CIOs and CTOs of the companies I talk to today do not keep the data, for many reasons, mostly cost. So what can a data scientist do then? Simple: he builds the model and then runs it. Over time it will get better, as it preserves the information. It is like chess, where you can watch top professional players analyze and predict moves to a depth the untrained could not imagine, or even provoke moves. That level is not yet embedded in this science.

 

A couple of weeks ago I was fortunate to see a start-up and talk to the founders. It was all about customer intimacy and combining information within a company to build a better customer relationship. I would take this one dimension deeper: why not use this as an internal knowledge base? Let's predict what the corporate user needs to find around a specific subject; not only internal sources but also external ones can be used here.

 

I am still wondering why HR representatives and headhunters do not use predictive analysis to identify the right candidate for a job.
Yes, you're right: the information in a company is often filed away in such a way that we have no possibility to combine it. Here we come to the concept of a data lake: building one repository of information and using it for internal and external benefit.

prop-jet engine

The last topic, however, is the „what could happen" case, prescriptive analytics. This is all about possibilities and, often, risks. I guess this is used in Six Sigma, and mankind is used to doing it too: the mother warns the child about the possible accident when walking along the street, and a company's strategic plans are all about such scenarios. I think this method means „I do not have enough data to be more precise."

 

So what to do? Keep all the data, maybe randomized and anonymized, but keep it, because you never know what business can be built out of the treasure the company has generated over the years.
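
A minimal sketch of that advice (the field names are invented for illustration): strip or hash the identifying fields before archiving, so the data can be kept without keeping the person. Real anonymization needs more care than this, e.g. salting and removing quasi-identifiers.

```python
import hashlib

def anonymize(record: dict, id_fields=("customer_id", "email")) -> dict:
    """Replace identifying fields with a one-way hash before archiving the record."""
    out = dict(record)
    for field in id_fields:
        if field in out:
            out[field] = hashlib.sha256(str(out[field]).encode()).hexdigest()[:16]
    return out

# Invented example record from an operational system.
raw = {"customer_id": "C-4711", "email": "a@example.com", "order_value": 129.90}
print(anonymize(raw))  # keeps the business value, drops the direct identifiers
```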

 

Welcome to the Age of Information!

Why Standards Matter


Do we really need standards in infrastructure?

Mandelbrot, Designed by Frax for iPAD

Mandelbrot by Frax for iPad

For a couple of years now I have had discussions with CIOs and other technology employees in global companies about standards. In fact, the introduction of the SAN as a protocol standard enabled the largest consolidation and optimization in the DC, starting in the early 2000s and continuing until now. One of the reasons is that there has to be a certain maturity of the market, and also a common understanding that there is no benefit in defining a "company" standard.
As we have seen in the telco industry, where companies defined their own phone infrastructure over the years, this has now completely changed to a VoIP-driven model with a lot of benefits. We see the same nowadays in the DC with the appearance of converged infrastructure. However, this approach is not yet mature enough to have one single standard, and the manufacturers of infrastructure and the VARs each define their own way. This currently has major impacts at customer sites, since either there will be a next-generation converged model that is defined more through organizations like SNIA and OpenStack models, or, more likely, it will be a 3rd platform approach, which may be called cloud.
Whatever the next years bring, the key here is that decisions are driven not only by price but by TCO and the future of the proposed architecture.
It is obvious that innovation cycles will accelerate and that the demand for more flexibility on the business side will drive different infrastructure needs.

What happened on the manufacturer side?

"Keep it simple" is the demand set by the creators of the 3rd platform. However, in the enterprises of global customers, IT still runs on the 1st or, mostly, on the 2nd platform. This has to be taken into consideration. Associations like SNIA take this as a basis and define standards which bridge the gap between the new, innovative requirements and the current business-driven IT requirements. Industry standards take a while to establish, and often the manufacturers develop their own "standard" to keep customers closer. In the early stages of a new technology this may be very convenient, but as the approach matures, the move to industry-wide adoption becomes necessary.
The same is currently happening in SNIA. The standards around the traditional SAN are defined, and management capabilities like SMI-S are defined and adopted as well. New areas like big data, object storage, analytics and flash are still more in the definition phase, and the manufacturers define their own strategies and APIs. SNIA is working closely with the vendors to get the industry better shaped here. Seeing that a lot of startups are still appearing and innovation is happening there too, the industry standard definition is still at its starting point, but customers would be wise to ask for certified products to make TCO and technology adoption more efficient, serving not only the cost model but also the new demands of their business.

Mandelbrot by Frax

Mandelbrot by Frax

My top 10 anticipations for 2013


The cloud will emerge in all parts of the IT

More services emerged in 2012; their adoption will drive more cloud, and more cloud offerings will appear. This cycle will speed up and drive more enterprises to evaluate their next private cloud.


The engine will be built in manufacturing, not at the customer premise

In many enterprises, 2012 started an evaluation of converged infrastructure. Since most of the components used are standardized, the „build on customer site" or „do-it-myself" approach will be questioned more and more by the CFO. As in the past, when whole servers were built by local shops or by enterprises themselves, it is obvious that the purchasing department will take a different look when it comes to TCO. This will also start to include backup and security. Orders will move towards workloads and software demands rather than cores, PB of storage or network interconnects. The VCE model will become the standard delivery method for modern DC architectures.

Consumerization of IT will drive more business adoption in the cloud

The DC will have more direct contact with the end customers. In 2012 this was one of the main drivers, built through Google/Android and Apple/iCloud, and it will move even faster. The new paradigm enables enterprises to optimize cost and the business model when talking directly to end customers. In 2013 we will see this shift into computer-to-computer relationships as well. The interaction of the B2B mesh will lead to faster purchasing and information flow to optimize the business, and early adopters will be the winners. Key here is an agile IT infrastructure.

Cloud will feed BigData, BigData will enable more cloud

Each one depends on the other. With more cloud, the information streams are easier to combine for big data analysis; with more big data analysis and applications, more agile computing environments (called cloud) become necessary. This will be a big trend, though it depends on the industry. New businesses will learn to use the information streams, and new forms of analytics will be created to enhance processes and the decision-making process.

Flash will be leading the storage industry

In recent years flash was used as a replacement for spinning disk. This has had a huge impact on performance and on the utilization of computing power. Man-years of brain power were spent developing smart algorithms to leverage the new tier in the storage. This will continue to evolve, and with the drop in price and the increase in capacity the market will grow dramatically. However, flash has not only been squeezed into the role of a spinning disk; it has also made it possible to design and implement new algorithms that exploit the power of flash. When IOs are no longer the limiting factor, CPU power can be used to let the storage system take on many more tasks while still delivering high performance to the servers.

The CIO will refocus on the core: the Information

With all the changes in IT, the transformation of roles in IT management will not stop. Since the DC gravitates towards the information, it is obvious that the CIO will be the master of the information: not only the information processed in the own DC, but also the information needed from outside, as in big data applications, and the information given to the outside, i.e. to customers. New technologies like the Horizon App Manager from VMware will support this transition, and the CIO and team will be transformed into the information broker, security agent and consultant for the business lines.

Standardization will enable more productivity

One aspect of going from private cloud to hybrid is standardization. Many companies, like Microsoft, Google or Amazon, among others, define APIs and push them as de facto standards. Experience has shown that early adopters often drive this, but run against a wall within a couple of years. Ethernet/IP and FC would not have been so broadly accepted if a standards body had not been formed. We currently see various associations taking on this role in the cloud, like the SNIA organization. This is the only way to help the DC out of the „do it myself" mode and let it focus on more business-relevant tasks. The engine (converged infrastructure) will be developed and assembled on the vendor's premises, and the enterprise DC managers can focus on its utilization.

The storm on clouds will drive orchestration

When virtualization was first introduced to customers, the VM landscape could still be controlled by humans. It is similar to the SAN in the early 2000s, when the storage array was still close to a server. This will continue to change as we see tens of thousands of VMs in modern DC architectures. Orchestration will provide the agility which is necessary to drive more business flexibility. Vendors, led by VMware, will provide more sophisticated solutions to automate here.

Keys.

Keys. (Photo credit: Bohman)

Security and Trust will be in the middle of everything

Since information is the key, as it was in the past but now in a much more open world, securing it is one of the key elements. The business will ask for answers, and companies like RSA will lead the way. Not only securing but also trusting other organizations is essential. With new regulations and the demands of the business, information has to become more trustworthy.

InMemory systems will draw investments from infrastructure

Since in-memory systems put more demand on main memory than on high I/O rates, they will re-architect the infrastructure in this area. 2013 will show whether these new technologies add new fields of application or replace others. Technologies like GemFire and HANA, among others, will drive faster decision making and new infrastructure architectures. Combined with flash, companies like SAP and EMC will drive the industry here.


Keynote at VMworld 2012, with Pat Gelsinger and Steve Herrod


VMworld Barcelona

VMworld Barcelona 2012

Pat Gelsinger is on stage in his new role as CEO of VMware.

Pat starts with a look back at history. Waves of change have been the only constant in the IT industry, led by IT innovations and driven by technology. He talks about phases of IT maturity, reactive vs. proactive.
The transformation happens in every layer, starting with consumers, IT departments, people, operations and procedures. It began in various layers simultaneously, and he talked about the infrastructure layer first, of course.

From server to cloud.

In 2008, 25% of servers were virtualized; today it is around 60%. The expectation is that in the near future more than 90% of all workloads will be virtualized. This also has an impact on the provisioning of servers: from weeks, to days, and in a few years to minutes and seconds or less. This can only be done by introducing a new paradigm: automation.

Next topic: the software-defined datacenter.
Introducing the VMware perspective, this means that everything is virtualized. Pat starts to talk about the huge legacy in the DC of today; the only solution to end the dilemma is to abstract, pool and then finally fully automate, leading to the manufacturing concept of „just in time", or everything as a service.
The software-defined datacenter is based on the vCloud Suite, the basic element at the beginning of the journey.

Pat Gelsinger on vCloud Suite

This suite is comprehensive, delivers the highest performance and proven reliability; check it out.
Pat now announced that vRAM will no longer exist. Based on customer feedback, VMware has decided that this is no longer the way to go. The new model is:

  • priced per CPU
  • one easy solution
  • has no limitations

Back to the software-defined datacenter, diving into the management philosophy: more automation, less management, which will lead to IT as a broker of services and also means that management must change.
Policy-based automation is the key to managing the future datacenter, taking time and human error out of the equation. This comprises:

  • service provisioning, including vCloud Automation and vFabric Application Director,
  • operations management, including the vCenter Operations Management Suite,
  • business management, including the IT Business Management Suite.

The CEO now talks about cloud infrastructure around software, technology and architecture. This leads to the next area: how do I operate in the new world? It is about people, culture and organization, as well as process and control and IT business management.
A new forum is being built to shape the ecosystem, called the „Cloud Ops Forum".
Moving on to multi-cloud environments: vCloud, physical, non-VMware and public.
Pat mentioned that PaaS is key, represented through Cloud Foundry, automatic service provisioning through DynamicOps, and software-defined networking and security, the keyword being Nicira.

Addressing the multi-cloud world, the VMware executive claims that VMware is ready for the open world.
The next highlight is on applications: moving apps into new dimensions with vFabric and application transformation.
Driving this transformation has only been possible through the huge partner ecosystem.
Pat now hands the mic over to Chief Technology Officer Steve Herrod.

Future Storage Directions

Steve starts to talk about the next-generation vCloud Suite and begins to dig in: VMware, vSphere, virtualization, focusing on Exchange, SharePoint Server and SAP as well as Oracle.

What is a Monster VM? Driving this to the edge:

  • 2011: 32 vCPUs, 1 TB of memory per VM and 1 M IOPS per host
  • 2012: up to 64 vCPUs, 1 TB of memory per VM and more than 1 M IOPS per host

So what about tomorrow's applications, and what are they? In telco and VoIP the issue is latency, and VMware is looking into that; based on vFabric, they are looking into shared virtual memory. HPC is also coming to virtualization, as is PaaS like Cloud Foundry, introducing more on the Hadoop integration.


Learn from the past, think about the future, but live in the now!


When I prepared for my final exams at university, one of my professors mentioned that their intention was not to guide us to good jobs immediately after university; the key was that we as graduates would have long-term chances to find and keep excellent jobs. At the time I thought it was stupid: my profession starts with my first job, not in 10 years.

Die of an Intel 80486DX2 microprocessor (actua...

Die of an Intel 80486DX2 microprocessor (actual size: 12×6.75 mm) in its packaging. (Photo credit: Wikipedia)

Time showed me that it is key to have both in mind when you start, and that it is essential to stay laser-sharp focused.

The same is true in datacenters, where CIOs are confronted with the new reality of BYOD, the data deluge, cyber attacks and an explosion of complexity. Positions and processes that worked for years no longer seem adequate, personnel get confused, and the business units often claim to have more IT knowledge, since they use flexible services on the Internet. The CIOs and IT managers I talked to often seek external help and call in consultants, who tell them that everything has to change, processes have to be reworked and the headcount has to decrease.

What a challenge!

Most people underestimate the finest attribute mankind has: the ability to change and adapt. Of course this is not easy; people like to stay in their comfort zone and don't like to be moved, to learn new skills, to move on and modify, even though they do this all their lives.

The most successful IT organizations I know have dealt with this superbly, taken their employees along on the road and established a culture of change. As we learn from the past, the change never stops: we moved from monolithic mainframes to minis to servers to PCs and now to tablets; from proprietary OSes to open source; from complex-instruction CPUs to RISC and back; from multiple CPU architectures to multiple servers to multi-core architectures; from ASCII terminals to mouse-driven interactive PCs to gesture-controlled tablets; from 640 KB of RAM to terabytes of RAM; from 5 MB disk drives to 4 TB drives and now flash technology. So we know the past, and we understand that telecommunications, which 10 years ago was an enterprise asset, is now an outsourced, VoIP-driven business.

So what will the future bring to plan for? New devices like Google Glass, faster storage, a move from keyboards to voice-driven interaction with computers, to in-memory databases built into a cloud OS, from applications to functions, from physical infrastructure topologies and silos to software-defined datacenters, from an application-centric approach to information-centric concepts, from blade farms to virtual workloads which are executed independently of the blades?

We will see computers talk to computers and make decisions for us; devices will generate information which is relevant only for seconds. Batch processing will become too slow to keep pace;

This image was selected as a picture of the we...

This image was selected as a picture of the week on the Farsi Wikipedia for the 13th week, 2011. (Photo credit: Wikipedia)

it will be replaced by big data algorithms.

On the consumption model, we will purchase only what we consume, vendors will have to deal with end customers, enterprises will move legacy workloads to specialized and focused workbenches, and all information will have to be secured and trusted before it is transferred.

So the key to the future will be all about information: generating, capturing, storing, manipulating, destroying and analyzing it. We say the gravity is on the information.

Based on that, CIOs and IT managers have to, and do, act in the now to prepare: build an agile infrastructure, make the datacenter investments in skills, new technology, information and big data that support workload management, and secure the information that is critical to the enterprise or to the privacy of an employee or customer. An enterprise's investments in IT should help build and prepare an agile datacenter around the information, ready for the near future and beyond.


The DataCenter Evolution


In the last decade the DC has gone through major reworks and design changes. In recent years the evolution has been marked by a revolution in the market: disruptive technology shifts and new classes of devices have changed the demands on a classic DC. Still, most of the money is spent keeping the lights on; on average only 25% of a datacenter's budget is left for new business and for improving the situation.

The DataCenter

(c) by presentermedia

So what will happen?

First of all there is not one answer to this question, but we can try to look into the future challenges of IT businesses:

The world will continue to generate huge amounts of information. Consumerization of IT will continue at a faster pace, with new devices appearing. Regulations will change to adapt to the new reality of IT, like privacy by default etc. Real-time information will overtake data collection for later processing. Security and trust alliances will become more necessary, since information is more volatile.

In 1999, when I experienced the first SAN implementations, many CIOs told me that there would be a limit on data storage, since the complexity of running two or more networks in the DC would become too much to handle. Looking back, the technology vendors and IT departments established a quite good understanding of how to train and build out teams of architects for SAN networking in parallel with the IP networks. Now the complexity of data storage is rising again, and DC managers are thinking about how to deal with it. The future will tell, but my view is that networking will come together under the SDN concepts, where the physical network is decoupled from the logical view. In this case the SAN and IP networks can be managed from one point. Furthermore, with the appearance of virtualization, resource management will also converge. Observing the technology development, the industry demand on information management, the cloud burst and the consumerization of information management, the path leads to a generic IT resource which is assigned a task. In other words, why have a storage device, a computing device, a networking controller or even a PC? It is defined by the demand. In a first step there will be an efficient device for each task, but I strongly believe that in a couple of years the physical hardware will no longer be the architectural definition of the DC. It will be the software.

The Earth

(c) by NASA

So why would a CEO today think about putting money into their own datacenter or IT department?

Most probably to gain an advantage over the competition by having applications better customized to the business process. I find it very interesting today that CIOs get asked by the business to benchmark themselves against the rest of the world. It is hard for them, since the internal BUs often do not compare fairly. The "good enough" principle will only work if it is also applicable internally.

Security demands, internal processes and other guidelines often drive the IT department into additional cost.

So the question should not be "do I need my own DC or IT department"; it should be a question about responsibility for the information. There is a reason why the DC is called a data center and the CIO is called "Chief Information Officer", not "Chief Datacenter Officer". In this context it is obvious that the DC no longer needs to exist in physical form; it is more the collection of services and information flows that the responsible manager should look after.

Finally, the datacenter will become the Turing machine!


Running the most powerful iPad


VMware View

In recent years, consolidation on the one hand and standardization on the other have raised users' expectations towards always-on, seemingly endless power. Even compute-intensive applications have moved into the cloud. With the challenge of the multi-device user experience, many consumers of IT have experienced a lack of integration. So applications like Dropbox and Box.net, or even the project "Octopus" from VMware, have been created to consolidate information into one single source.

Enterprises have trouble with that, since a layer of trust and control may leave the ship. Taking into account that we use more than one device, depending on our personal lifestyle and the expectation from culture or business to be online more often, the devices need to be "always on", trusted and reliable.

We call this the user experience, which VMware has taken into account by developing the virtual desktop infrastructure (VDI). This lets your desktop run on a server and "beams" the screen to your device.

 

 

 

A lot of advantages come with that.

  • First and foremost, it can run all the time. There is no need to shut it down at all. I am not talking about sleep; it runs. This implies that Outlook rules, for example, will always be executed.
  • Second, it is fast. Since all the VDI desktops run on the same hardware, or close to it, communication is fast. Resources can be shared, and infrastructure is no longer carried around unused when the user is inactive; others can leverage the remaining power.
  • Third, administration can be done much more easily. Since I started using VDI for business, there was at least one incident where our antivirus provider sent a corrupted update, which blocked most of the laptops. The VDI was affected as well. However, the VDI desktops were shut down, reinitialized and then rebooted to recover: more than 1000 virtual laptops were restored in minutes, involving 2 employees. The physical exchange of the profiles took days and involved most of the IT specialists in the field.
  • Fourth, security is key in enterprises. Even though VDI is not free from fraud, it can at least be better controlled by the admins, and security patches are installed in one big shot.
  • Fifth, backup: it is all in the DC, and most of the information is redundant, so deduplication like Avamar or Data Domain can be much more effective. Restore is fast as well, since it happens in the DC.

So what is the catch? In my experience with VDI, it is hard for offline travelers and on very high-latency connections.

 

Using it for my daily work enables me to run my apps like Outlook, PowerPoint and salesforce.com on my iMac, iPad and MacBook at the same time, with no boot-time delay, easy access and 100% support from the IT department.

Even though my iPad is not the best device for producing work, it can leverage all the legacy applications I have to use and, piggybacking on the VDI, gives me the best infrastructure to run my desktop: a Vblock from VCE.

Many IT experts I have spoken to see that VDI is the key to BYOD and enterprise demands, but say the infrastructure has to be flexible enough to compensate for the resources the enterprise demands, since it is not easy to predict the number of desktops running at the same time. This is not exactly true, since VMware and VCE have good models and experience for designing this.

 

My strong belief is that, with user-centric IT, VDI will be the future of enterprise desktop management, and it will deliver the power of the enterprise to the user's device: quick, easy and reliable.

Related: 

VMware View demo on iPAD

Was Spock the first Data Analyst?


In the last couple of years a discussion around the information society has started. Since more people enter data about their lives as well as our planet, it was obvious that businesses would start to leverage this trend and add more data; Google, for example, scanned the Library of Congress and mapped the planet, including the oceans. Nowadays even this is topped: data combinations and new streams of information are provided, some free, some for purchase.

Now, in the Star Trek series, the chief science officer, called Spock, has the task of gathering as many data streams as possible delivered on the starship, combining them with the knowledge of a huge computer library and drawing conclusions from them, in real time. In this series, logic and the ability to draw fast conclusions, so the captain could make the relevant decisions, were key to survival.

Vulcan (Star Trek)

Vulcan (Star Trek) (Photo credit: Wikipedia)

In modern business, the survival of companies depends on fast, exact, agile conclusions. Modern technologies like Chorus, a product of Greenplum, enable businesses of all sizes to gain insight into markets, customers, competition etc. Where in the past this could only be done over a long time frame, today's businesses move toward continuous optimization and adaptation of their go-to-market and their portfolios.

To enable an agile business process, the leaders of companies have to gather as many streams of data around their business as possible, combine them with inside knowledge and make the tough decisions.

Specialists who turn this data into information from which decisions are drawn are called data analysts, while identifying the relevant data streams out of the white noise is the job of a data scientist.

Of course, today the time between analysis and decision-making is not quite as short as it often was on the Enterprise, but with the trend of more and faster data generation and access, more agile businesses grow as startups and compete with the established ones.

Looking at trends in the IT departments of enterprises of all kinds, the desire for more agility leads to a cloud approach. This is only the first step; the final state is to be in the middle of the data universe and navigate the enterprise through the business solar system. The input will be overwhelming, new processes for sensor input will need to be developed, and the crew will have to be aligned to the new command structure.

The engineering section, which we would call infrastructure, has to offer flexible and agile systems to answer requests quickly and correctly. One key to success is automation, orchestration and standardization, not dictation and a silo approach. Scotty would most probably fit into a data scientist role.

Star Trek: Phase II

Star Trek: Phase II (Photo credit: Wikipedia)

CIOs will become more like captains, understanding the challenges in this new space and aligning the crew and the rest of the ship to the needs of the next decade. If cloud computing is the engine for agility, big data is the survival kit for the enterprise of the future. So Spock and Scotty are the two main assets of modern Enterprises, and James T. Kirk always drew the right decisions from them.
