Keynote at VMworld 2012, with Pat Gelsinger and Steve Herrod


VMworld Barcelona 2012

Pat Gelsinger is on stage in his new role as CEO of VMware.

Pat starts with a look back at history. Waves of change have been the only constant in the IT industry, led by IT innovation and driven by technology. He talks about phases of IT maturity, reactive vs. proactive.
The transformation touches every layer: consumers, IT departments, people, operations and procedures. It began in various layers simultaneously, and of course he talks about the infrastructure layer first.

From server to cloud.

In 2008 about 25% of servers were virtualized; today it is around 60%, and the expectation is that in the near future more than 90% of all workloads will be virtualized. This also has an impact on server provisioning: from weeks, to days, and in a few years to minutes, seconds or less. This can only be done by introducing a new paradigm: automation.

Next topic: the software-defined datacenter.
Introducing the VMware perspective: everything is virtualized. Pat talks about the huge legacy in today's DC; the only way out of the dilemma is to abstract, pool and then fully automate, leading to the manufacturing concept of "just in time", or everything as a service.
The software-defined datacenter is based on the vCloud Suite, the basic building block at the beginning of the journey.

Pat Gelsinger on vCloud Suite

The suite is comprehensive, delivers the highest performance and proven reliability; check it out.
Pat now announced that vRAM will no longer exist. Based on customer feedback, VMware has decided this is no longer the way to go. The new model is:

  • priced per CPU
  • one easy solution
  • has no limitations

Back to the software-defined datacenter, diving into the management philosophy: more automation, less management. IT will become a broker of services, which also means that management must change.
Policy-based automation is the key to managing the future datacenter, taking time and human error out of the equation (see the sketch after this list). This maps to:

  • service provisioning, including vCloud Automation and vFabric Application Director,
  • operations management, including the vCenter Operations Management Suite,
  • business management, including the IT Business Management Suite.
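
As a purely illustrative sketch of what policy-based provisioning means (the Policy and VmRequest classes and all limits below are invented for illustration, not VMware's actual API), the decision to approve and size a VM can be taken from a declared policy with no human in the loop:

```python
# Toy example of policy-based automation: the policy, not a ticket queue,
# decides whether a request is provisioned. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class Policy:
    max_vcpus: int          # hard sizing limit per VM for this business unit
    max_memory_gb: int
    allowed_tiers: tuple    # storage tiers the requester may use

@dataclass
class VmRequest:
    name: str
    vcpus: int
    memory_gb: int
    tier: str

def provision(request: VmRequest, policy: Policy) -> str:
    """Approve or reject a request purely from policy, no human in the loop."""
    if request.vcpus > policy.max_vcpus or request.memory_gb > policy.max_memory_gb:
        return f"rejected: {request.name} exceeds the sizing policy"
    if request.tier not in policy.allowed_tiers:
        return f"rejected: {request.name} asks for a disallowed storage tier"
    # In a real suite this is where the orchestration call would happen.
    return f"provisioned: {request.name} ({request.vcpus} vCPU / {request.memory_gb} GB, {request.tier})"

finance_policy = Policy(max_vcpus=8, max_memory_gb=64, allowed_tiers=("silver", "gold"))
print(provision(VmRequest("fin-app-01", vcpus=4, memory_gb=32, tier="gold"), finance_policy))
print(provision(VmRequest("fin-app-02", vcpus=16, memory_gb=32, tier="gold"), finance_policy))
```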

The CEO now talks about cloud infrastructure around software, technology and architecture. This leads to the next area: how do I operate in the new world? It is about people, culture and organization, as well as process and control, and IT business management.
A new forum is being built to shape the ecosystem, called the "Cloud Ops Forum".
Moving on to the multi-cloud environment: vCloud, physical, non-VMware and public clouds.
Pat mentioned that PaaS is key, represented by Cloud Foundry, automated service provisioning by DynamicOps, and software-defined networking and security; the keyword is Nicira.

Addressing the multi-cloud world
The VMware executive claims the company is ready for the open world.
The next highlight is on applications: moving apps into new dimensions, from vFabric to application transformation.
Driving this transformation has only been possible through the huge partner ecosystem.
Pat now moves the mic over to Chief Technology Officer Steve Herrod.

Future Storage Directions

Steve starts to talk about the next-generation vCloud Suite and starts to dig in: VMware, vSphere, virtualization, focusing on Exchange, SharePoint Server and SAP as well as Oracle.

What is a Monster VM? Driving this to the edge:

  • 2011: 32 vCPUs, 1 TB per VM and 1 M IOPS per host
  • 2012: up to 64 vCPUs, 1 TB per VM and more than 1 M IOPS per host

So what about tomorrow's applications? What are they? In telco the VoIP issue is latency, and VMware is looking into that; based on vFabric they are looking into shared virtual memory. HPC is also coming to virtualization, and PaaS like Cloud Foundry, introducing more on the Hadoop integration.


Learn from the past, think about the future, but live in the now!


When I prepared for my final exams at university, one of my professors mentioned that their intent was not to guide us to good jobs immediately after university; the key was that we as graduates would have long-term chances to find and keep excellent jobs. At the time I thought this was stupid: my profession starts with my first job, not in 10 years.


Die of an Intel 80486DX2 microprocessor (actual size: 12×6.75 mm) in its packaging. (Photo credit: Wikipedia)

Time showed me that it is key to have both in mind when you start, and that it is essential to stay laser-focused.

It is the same in datacenters, where CIOs are confronted with the new reality of BYOD, data deluge, cyber attacks and a complexity explosion. Positions and processes that worked for years no longer seem adequate, staff get confused, and business units often claim to have more IT knowledge, since they use flexible services on the Internet. The CIOs and IT managers I talked to often seek external help and call consultants, who tell them that everything has to change, processes have to be reworked and headcount has to decrease.

What a challenge!

Most people underestimate the finest attribute mankind has: the ability to change and adapt. Of course this is not easy; people like to stay in their comfort zone and don't like to be moved, to learn new skills, to move on and modify, even though they do this all their lives.

The most successful IT organizations I know have handled this superbly, taking their employees along on the road and establishing a culture of change. As we learn from the past, change never stops: we moved from monolithic mainframes to minis, to servers, to PCs and now to tablets; from proprietary OSes to open source; from complex-instruction CPUs to RISC and back; from multiple CPU architectures to multiple servers to multi-core architectures; from ASCII terminals to mouse-driven interactive PCs to gesture-driven tablets; from 640 KB of RAM to terabytes of RAM; from 5 MB disk drives to 4 TB and now flash technology. So we know the past, and we understand that telecommunications, an enterprise asset 10 years ago, is now an outsourced, VoIP-driven business.

So what will the future bring to plan for? New devices like Google Glass, faster storage, a move from keyboards to voice-driven interaction with computers, to in-memory databases built into a cloud OS, from applications to functions, from physical infrastructure topologies and silos to software-defined datacenters, from an application-centric approach to information-centric concepts, from blade farms to virtual workloads that execute independently of the blades?

We will see computers talk to computers and make decisions for us; devices will generate information that is only relevant for seconds. Batch processing will become too slow to keep pace;


This image was selected as a picture of the week on the Farsi Wikipedia for the 13th week, 2011. (Photo credit: Wikipedia)

it will be replaced by big data algorithms.

On the consumption model, we will purchase only what we consume; vendors will have to deal with end customers; enterprises will move legacy workloads to specialized and focused workbenches; and all information will have to be secured and trusted before it is transferred.

So the key for the future will be all about information: generating, capturing, storing, manipulating, destroying and analyzing it. We say the gravity is on information.

Based on that, CIOs and IT managers have to, and do, act in the now to prepare: build an agile infrastructure, and make datacenter investments in skills, new technology, information and big data that support workload management and secure the information that is critical to the enterprise or to the privacy of an employee or customer. An enterprise's IT investments should help build and prepare an agile datacenter around the information, ready for the near future and beyond.


The DataCenter Evolution


In the last decade the DC has undergone major reworks and design changes. In recent years the evolution has been marked by a revolution in the market: disruptive technology shifts and new classes of devices have changed the demands on the classic DC. Still, most of the money is spent to keep the lights on; on average, only 25% of a datacenter's budget goes to new business and improving the situation.

The DataCenter

(c) by presentermedia

So what will happen?

First of all, there is no single answer to this question, but we can try to look at the future challenges of IT businesses:

The world will continue to generate huge amounts of information. Consumerization of IT will continue at a faster pace as new devices appear. Regulations will change to adapt to the new reality of IT, like privacy by default. Real-time information will overtake data collection for later processing. Security and trust alliances will become more necessary, since information is more volatile.

In 1999, when I experienced the first SAN implementations, many CIOs told me that there would be a limit on data storage, since the complexity of running two or more networks in the DC would become too much to handle. Looking back, technology vendors and IT departments established a quite good understanding of how to train and build out teams of architects for SAN networking in parallel with the IP networks. Now the complexity of data storage is rising, and DC managers are thinking about how to deal with it. The future will tell, but my view is that networking will converge with SDN concepts, where the physical network is decoupled from the logical view. In that case SAN and IP can be managed from one point. Furthermore, with the appearance of virtualization, resource management will also converge. Observing the technology developments and the industry demand around information management, cloud bursting and the consumerization of information management, the path leads to a generic IT resource that is assigned a task. In other words, why have a storage device, a computing device, a networking controller or even a PC? It is defined by demand. There will be an efficient device for each task in a first step, but I strongly believe that in a couple of years the physics will no longer be the architectural definition of the DC. It will be the software.
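
To make the "physical decoupled from logical" idea more concrete, here is a minimal, purely illustrative sketch; the class names and the tiny controller are my own invention, not an actual SDN or VMware API. The logical network is defined once, while a controller decides, and can later change, where its workloads land physically.

```python
# Toy model of a logical network definition decoupled from physical placement.
from dataclasses import dataclass, field

@dataclass
class PhysicalPort:
    switch: str          # physical switch the port lives on
    port: int

@dataclass
class LogicalNetwork:
    name: str                                     # e.g. a tenant or application network
    members: list = field(default_factory=list)   # workloads attached to it

    def attach(self, workload: str):
        self.members.append(workload)

class Controller:
    """A toy 'single point of management' that owns all physical placement."""
    def __init__(self):
        self.placement: dict[str, PhysicalPort] = {}

    def place(self, workload: str, port: PhysicalPort):
        # The controller decides where a workload lands physically;
        # the logical network definition never changes.
        self.placement[workload] = port

# The logical definition stays stable even if workloads move between switches.
app_net = LogicalNetwork("app-tier")
app_net.attach("vm-web-01")

ctrl = Controller()
ctrl.place("vm-web-01", PhysicalPort(switch="rack1-sw2", port=14))
print(app_net.name, "->", ctrl.placement["vm-web-01"])
```

The point is only the shape of the idea: the logical definition is untouched when the placement changes, which is what would allow both SAN and IP traffic to be steered from one point.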

The Earth

(c) by NASA

So why would a CEO today think about putting money into their own datacenter or IT department?

Most probably to gain an advantage over the competition by having applications better customized to the business process. I find it very interesting that today CIOs are asked by the business to compete with the rest of the world. It is hard for them, since the internal BUs often do not compare fairly. The "good enough" principle will only work if it is also applicable internally.

Security demands, internal processes and other guidelines often drive the IT department into additional cost.

So the question should not be "do I need my own DC or IT department"; it should be a question about responsibility for the information. There is a reason why the DC is called a datacenter, and the CIO is called the "Chief Information Officer", not the "Chief Datacenter Officer". In this context it is obvious that the DC no longer needs to exist in physical form; it is rather the collection of services and the information flow that the responsible manager should look after.

Finally, the datacenter will become the Turing machine!


Software Defined Networks: the last mile for the infrastructure


Technologies like storage virtualization and server virtualization have enabled new topologies and cost-effective management of resources. In recent history this has driven a disruptive change in IT operations in most companies. The last missing piece was network virtualization.

In the past, networks were always defined through protocols. These helped drive the implementation into silicon, as had happened with processors a decade before. The other effect was that innovation was hindered, since everything had to be aligned with the protocol standard.

However, with the definition of SDN, the protocol stack is hidden behind the topology and new ways of networking can be achieved. As with storage virtualization, innovation comes with it.

                Source: slashdot

In the next years, the arguments around various concepts of virtualization will arise, and there will be more than one champion, but with the acquisition of Nicira, VMware will lead the way.   

EMC and VMware now have all the components to enable enterprise customers and SIs to deploy their own ITaaS stack ((I,P,S,M)aaS) to drive the private cloud and enable the hybrid approach.

With all this technology, the key is to enable the business for more agility and diversification, faster GTM and more advanced offerings without IT dependence; we tend to call this the age of BigData.

SDN opens the dimension for better data coverage, RESTful services and more integration of concepts without dependency on hardware.

Enterprises can now continue the journey and leverage infrastructure resources in a more cost-efficient and agile way than was possible in the past. For VMware it is the right step towards a true cloud OS.


Running the most powerful iPad


VMware View

In recent years, consolidation on one hand and standardization on the other have raised users' expectations towards always-on, endless power. Even compute-intensive applications have moved into the cloud. With the challenge of the multi-device user experience, many consumers of IT experienced a lack of integration. So applications like Dropbox, Box.net or even VMware's project "Octopus" were created to consolidate information into one single source.

Enterprises have trouble with that, since a layer of trust and control may leave the ship. Taking into account that we use more than one device, and that personal lifestyle and the expectations of culture or business push us to be online more often, the devices need to be always-on, trusted and reliable.

We call this user experience, which VMware has taken into account in developing the virtual desktop infrastructure (VDI). This lets your desktop run on a server and "beams" the screen to your device.

A lot of advantages come with that.

  • First and foremost, it can run all the time. There is no need to shut down at all. I am not talking about sleep; it runs. This implies that Outlook rules, for example, are always executed.
  • Second, it is fast. Since all the VDI desktops run on the same or nearby hardware, communication is fast. Resources can be shared, and infrastructure is no longer tied up when a user is inactive; others can leverage the remaining power.
  • Third, administration can be done much more easily. Since I have used VDI for business, there was at least one incident where our antivirus provider sent a corrupted update, which blocked most of the laptops. The VDI was affected too. However, the VDI was shut down, reinitialized and then rebooted to recover: more than 1,000 virtual laptops were restored in minutes, involving two employees. The physical exchange of the profile took days and involved most of the IT specialists in the field.
  • Fourth, security is key in enterprises. Even though VDI is not free from fraud, it can at least be better controlled by the admins, and security patches are installed in one big shot.
  • Fifth, backup: it is all in the DC and most of the information is redundant, so deduplication solutions like Avamar or Data Domain can be much more effective, and restore is also fast, since it happens in the DC.

So what is the catch? In my experience, VDI is hard for offline travelers and on very high-latency connections.

 

Using it for my daily work enables me to run my apps like Outlook, PowerPoint and salesforce.com on my iMac, iPad and MacBook at the same time, with no boot-time delay, easy access and 100% support from the IT department.

Even though my iPad is not the best device for producing work, it can leverage all the legacy applications I have to use and, piggybacking on the VDI, gives me the best infrastructure to run my desktop: a Vblock from VCE.

Many IT experts I have spoken to see VDI as the key to BYOD and enterprise demands, but say the infrastructure has to be flexible enough to compensate for demand spikes, since it is not easy to predict the number of desktops running at the same time. This is not exactly true, since VMware and VCE have good models and experience for designing this.

 

My strong belief is that with user-centric IT, VDI will be the future of enterprise desktop management, delivering the power of the enterprise to the user's device: quick, easy and reliable.

Related: 

VMware View demo on iPAD

Was Spock the first Data Analyst?


Over the last couple of years a discussion around the information society has started. Since more people enter data about their lives as well as our planet, it was obvious that businesses would start to leverage this trend and add more data; Google, for example, scanned the Library of Congress and mapped the planet including the oceans. Nowadays this is topped: data combinations and new streams of information are provided, some free, some for purchase.

Now, in the Star Trek series, the chief science officer, Spock, has the task of gathering as many data streams as possible delivered to the starship, combining them with the knowledge of a huge computer library and drawing conclusions from them, in real time. In this series, logic and the ability to draw fast conclusions, so that the captain could make relevant decisions, were key for survival.

Vulcan (Star Trek) (Photo credit: Wikipedia)

In modern business the survival of companies depends on fast, exact, agile conclusions. Modern technologies like Chorus, a product of Greenplum, enable businesses of all sizes to gain insight into markets, customers, competition and so on. While in the past this could only be done over a long time frame, today's businesses move toward continuous optimization and adaptation of their GTM and their portfolios.

To enable an agile business process, company leaders have to gather as many streams of data around their business as possible, combine them with inside knowledge and make the tough decisions.

Specialists who turn this data into information from which decisions can be drawn are called data analysts, while identifying the relevant data streams out of the white noise is the job of a data scientist.

Of course, today the time between analysis and decision-making is not as short as it often was on the Enterprise, but with the trend of more and faster data generation and access, more agile businesses grow as startups and compete with the established ones.

Looking at trends in IT departments of enterprises of all kinds, the desire for more agility leads to a cloud approach. This is only the first step; the end state is to be in the middle of the data universe and navigate the enterprise through the business solar system. The input will be overwhelming, new processes for sensor input will need to be developed and the crew aligned to the new command structure.

The engineering section, which we would call infrastructure, has to offer flexible and agile systems to answer requests fast and correctly. One key to success is automation, orchestration and standardization, not dictation and a silo approach. Scotty would most probably fit into a data scientist role.


Star Trek: Phase II (Photo credit: Wikipedia)

CIOs will become more like captains, understanding the challenges in this new space and aligning the crew and the rest of the ship to the needs of the next decade. If cloud computing is the engine for agility, big data is the survival kit for the enterprise of the future. So Spock and Scotty are the two main assets of the modern enterprise, and James T. Kirk always drew the right decisions from them.


BigData is Energy


Mankind has been searching for energy since we crawled out of the sea, naturally consuming various forms of energy to grow, to work, to heat and for other life improvements. Over time, we as a life form have managed to collect more energy than we need for our basal metabolism.

So we started to invest in culture and leisure as well as technology. Understandably, this was not for all, but for some. We started to trade and exchange this surplus.

Nowadays new forms of energy transport have appeared, like money or knowledge. In the last century we also found enormous pools of energy: oil and gas, nuclear, as well as renewable forms. This led to an incredible push in technology, trading and prosperity in parts of the world. While this happened, knowledge was one of the energy forms that was kept close, and most nations and companies cultivated their own harvest of intellectual property (IP).

But then the Internet was born, around 20 years ago. Knowledge spread across the planet and an incredible innovation boost started. With now 7B people on this planet, knowledge is turned into IP and the dimensions of this planet are captured in various ways. Mathematics cracks the complexity of our environment into formulas, and we can understand the world we live in much better. More people also have access to information which centuries before was kept in libraries. So ideas can be born everywhere in the world, for demands never thought of previously.

With the social revolution of the Internet and the appearance of new devices, which make it easy for whole generations to feed the WWW with more data, pure knowledge became less important, since all basic and complex questions seem to be answered with a tap of the fingers or, more recently, with a question to the iPhone. The requested data can now be consumed in the form mankind best digests: reading, listening, watching; different from culture to culture. There are pictures, text, speech, videos and so on. However, the data is often turned into information only in the brain of the user. Specialists have to explain the information and draw conclusions. Also, all data or information is a snapshot in time; it has to be recreated often and frequently. This was not a big issue years ago, but with the speed of change in our business and life, more agile information is needed.

This is not the end. The Internet of Things is approaching very fast, and we see that combining data from various sources produces new data. So where does this lead? Big data.

Another aspect becomes quite clear: data has an energy value, which depreciates over time. This energy is called information. It depends on the data itself and may regain its energy level at another point in time. As in auctions, much data has a short but powerful energy peak while the auction is running; before and after, its value is much lower.
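
As a purely illustrative sketch of this depreciation (the bid_value function and all the numbers are invented for illustration, not taken from any real system), one could model the value of knowing an auction's current highest bid like this:

```python
# Toy model: the "energy" (value) of a piece of data depends on when you look at it.
from datetime import datetime, timedelta

def bid_value(observed_at: datetime, auction_start: datetime, auction_end: datetime) -> float:
    """Relative value of knowing the current highest bid at a given time."""
    if auction_start <= observed_at <= auction_end:
        return 1.0                      # peak value while the auction runs
    if observed_at < auction_start:
        return 0.1                      # mildly useful beforehand
    # After the auction the value decays quickly with age.
    age_hours = (observed_at - auction_end).total_seconds() / 3600
    return max(0.0, 0.5 - 0.1 * age_hours)

start = datetime(2012, 10, 9, 12, 0)
end = start + timedelta(hours=2)
print(bid_value(start + timedelta(hours=1), start, end))   # 1.0 during the auction
print(bid_value(end + timedelta(hours=6), start, end))     # 0.0: already worthless
```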

This is the power of BigData: it can leverage the combination of huge amounts of data, structured and unstructured, and convert it into information in real time. The data itself is often not owned by one organization or individual; it is the combination of various sources. The value emerges through combination, realignment with new sources and optimization.


Imagine the prediction of a tsunami or catastrophic earthquake, which has huge value if the confidence of the prediction is extremely high and the lead time before it happens is long. People recognizing strange behavior of animals, the ground and so on is the collection of data. As described before, the energy level of this information is often only relevant if you have it in real time, adjusted to new information on the fly.

This is where the idea of big data can help tremendously. It will let new information generation happen as an actual fusion of various aspects in parallel, and also generate new data which can be leveraged by another big data engine. Over time, interpretation is key; facts will be established instantaneously. Believe it or not, out of all the Twitter messages, even if most of them are useless to you, a newspaper can be built that is more accurate and neutral than any editor will ever be able to be.

BigData will change the world; it is already starting. If it is done right, the energy will be enormous and everywhere, dependent not on the culture but on the people themselves providing the fundamental data.