The MS-DOS Phase of Big Data



The first developers of IBM PC computers neglected audio capabilities (first IBM model, 1981). (Photo credit: Wikipedia)

Recently I was invited to speak to several CIOs in Germany. My presentation focused on the coming years: what to expect from society, how IT changes everything, how to prepare, and what to train employees on. IDC talks about this as the concept of the 3rd Platform. Thinking about the big-data hype while preparing for the day, I was reminded of my time with MS-DOS. Back then there was more than one approach to the “PC era”: there was DR-DOS, Apple, still mainframes, and so on.

One of the reasons MS-DOS got the most attention was its affinity with IBM and the combination, at the time, of a device with a marketplace. IBM also opened the space for others to develop software and add-ons for the PC. This generated momentum.

Looking back, the development up to Windows 7 took a close combination of hardware (like Intel), software (like Microsoft), and a change in the market.

We are currently experiencing the same with cloud, virtualization, and the future stage of a software-defined datacenter, and I wonder whether we have really passed the MS-DOS phase yet.

So what is the MS-DOS phase?

Plenty of players, no defined market, overselling of functionality, misleading abbreviations, and poorly trained experts, the best of whom come from the vendors themselves.

In all my talks with CEOs and CTOs across industries, the same questions come up:

  • What can I do with this technology?
  • Who can help me build out the business context?
  • Is it already time to invest?

There is a clear answer: it depends!

It depends on the maturity of the IT department:

  • Are they still just maintaining the IT, or do they drive innovative, business-related IT processes?
  • Where is the infrastructure? Is the IT department still deploying storage, network, and servers, or is it running a converged infrastructure with a services layer?
  • Do IT departments still focus on infrastructure training, or are they hiring data scientists?

MD Tag: MS DOS (Photo credit: shawnblog)

If you can answer these questions for your IT department, you are much closer to leaving the MS-DOS phase behind you.

In the past, many business models were established and then ran unchanged for years. That will no longer be true in many traditional businesses. These businesses are often in an MS-DOS phase as well, but they will adapt through the new market drivers: consumerization, urbanization, mobility, machine-to-machine decision processes, and new adaptive computation processes that adjust the business to user demand faster than real time.

New concepts are unpredictable: crowdsourcing, open source, Raspberry Pi hardware, machine learning. There is only one big truth: the move from MS-DOS to OS X took 20 years, but this time it will happen tomorrow.


My top 10 anticipations for 2013


The cloud will emerge in all parts of IT

More services emerged in 2012; their adoption will drive more cloud, and more cloud offerings will appear. This circle will speed up and drive more enterprises to evaluate their next private cloud.


The engine will be built in the factory, not at the customer's premises

For many enterprises, 2012 started an evaluation of converged infrastructure. Since most of the components used are standardized, the “build on customer site” or “do it myself” approach will be questioned more and more by the CFO. As in the past, when whole servers were built by local shops or by enterprises themselves, it is obvious that the purchasing department will take a different look at TCO. This will also start to include backup and security. Orders will move toward workloads and software demands rather than cores, petabytes of storage, or network interconnects. The VCE model will become the standard delivery method for modern datacenter architectures.

Consumerization of IT will drive more business adoption in the cloud

The datacenter will have more direct contact with end customers. In 2012 this was one of the main drivers, built by Google/Android and Apple/iCloud, and it will move faster. The new paradigm enables enterprises to optimize cost and the business model by talking directly to end customers. In 2013 we will see this shift into computer-to-computer relationships as well. The interaction of the B2B mesh will lead to faster purchasing and information flow that optimize the business, and early adopters will be the winners. The key here is an agile IT infrastructure.

Cloud will feed BigData, BigData will enable more cloud

Each depends on the other. With more cloud, information streams are easier to combine for big-data analysis; with more big-data analysis and applications, more agile computing environments (called cloud) are necessary. This will be a big trend, though it depends on the industry. New businesses will learn to use the information streams, and new forms of analytics will be created to enhance processes and decision making.

Flash will lead the storage industry

In recent years flash was used as a replacement for spinning disk. This had a huge impact on performance and on the utilization of computing power. Man-years of brainpower were spent developing smart algorithms to leverage the new tier in the storage. This will continue to evolve, and with falling prices and growing capacity the market will grow dramatically. However, flash has not only been squeezed into the role of spinning disk; it has also made it possible to transform and implement new algorithms that exploit the power of flash. When IOs are no longer the limiting factor, CPU power can be used to let the storage system take on many more tasks while still delivering high performance to the servers.
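To make the idea of such tiering algorithms concrete, here is a toy sketch in Python of a hot/cold placement decision. The threshold and windowing are invented for illustration; this is not any vendor's actual logic.

```python
# Illustrative sketch only: a toy hot/cold tiering policy. Blocks that
# are read often within a window get promoted from the disk tier to the
# flash tier; blocks that go cold are demoted at the window boundary.
from collections import Counter

PROMOTE_THRESHOLD = 3  # assumed: reads per window before promotion

class TieredStore:
    def __init__(self):
        self.flash = set()          # block ids currently on flash
        self.accesses = Counter()   # per-block access count in this window

    def read(self, block_id):
        self.accesses[block_id] += 1
        if self.accesses[block_id] >= PROMOTE_THRESHOLD:
            self.flash.add(block_id)  # hot block: promote to flash
        return "flash" if block_id in self.flash else "disk"

    def end_window(self):
        # demote blocks that went cold during the last window
        self.flash = {b for b in self.flash
                      if self.accesses[b] >= PROMOTE_THRESHOLD}
        self.accesses.clear()

store = TieredStore()
for _ in range(3):
    tier = store.read("block-42")
print(tier)  # "flash": promoted after the third read in one window
```

Real arrays make this decision per sub-block, asynchronously, and with far smarter heuristics, but the shape of the problem is the same.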

The CIO will refocus on the core: the Information

With all the changes in IT, the transformation of roles in IT management will not stop. Since the datacenter gravitates toward the information, it is obvious that the CIO will become the master of the information: not only the information processed in the company's own datacenter, but also the information needed from outside, as in big-data applications, and the information given to the outside, e.g. to customers. New technologies like Horizon Application Manager from VMware will support this transition, and the CIO and team will be transformed into information brokers, security agents, and consultants for the business lines.

Standardization will enable more productivity

One aspect of going from private cloud to hybrid is standardization. Many companies, such as Microsoft, Google, or Amazon, define APIs and push them as de facto standards. Experience shows that early adopters often drive this but run into a wall after a couple of years. Ethernet/IP and FC would not have been so broadly accepted if standards bodies had not been formed. We currently see various associations taking on this role in the cloud, such as SNIA. This is the only way to help the datacenter out of “do it myself” mode and toward more business-relevant tasks. The engine (converged infrastructure) will be developed and assembled on the vendor's premises, and enterprise datacenter managers can focus on utilization.

The storm on clouds will drive orchestration

When virtualization was first introduced to customers, the VM landscape could still be managed by hand. It is similar to the SAN in the early 2000s, when a storage array was still close to a server. This will continue to change as we see tens of thousands of VMs in modern datacenter architectures. Orchestration will deliver the agility that is necessary for more business flexibility. Vendors, led by VMware, will provide more sophisticated solutions to automate here.
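The core idea behind that kind of orchestration can be sketched as a reconciliation loop: declare the desired state, observe the actual state, and compute the difference. The sketch below is a hypothetical toy with invented service names, not how any VMware product works, but it shows why 10,000 VMs need no more human attention than 10.

```python
# Toy reconciliation: declare how many VMs each service should have,
# compare with what is actually running, and derive create/delete actions.

def reconcile(desired, running):
    """Return (create, delete) actions needed to reach the desired state."""
    create, delete = [], []
    for svc, want in desired.items():
        have = running.get(svc, 0)
        if have < want:
            create += [(svc, i) for i in range(have, want)]   # scale up
        elif have > want:
            delete += [(svc, i) for i in range(want, have)]   # scale down
    return create, delete

desired = {"web": 4, "db": 2}   # declared policy (invented example)
running = {"web": 2, "db": 3}   # observed inventory
create, delete = reconcile(desired, running)
print(len(create), len(delete))  # 2 VMs to create, 1 to delete
```

The same loop runs unchanged whether the dictionaries hold two services or two thousand; only the machine does more work, not the operator.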


Keys. (Photo credit: Bohman)

Security and Trust will be in the middle of everything

Since information is the key, as it always was, but now in a much more open world, securing it is one of the key elements. The business will ask for answers, and companies like RSA will lead the way. Not only securing information but also trusting other organizations is essential. With new regulations and business demands, information has to become more trustworthy.

In-memory systems will draw investments from infrastructure

Since in-memory systems place more demand on main memory than on high I/O rates, they will re-architect the infrastructure in this area. 2013 will show whether these new technologies add new fields of application or replace existing ones. Technologies like GemFire and HANA, among others, will drive faster decision making and new infrastructure architectures. Combined with flash, companies like SAP and EMC will drive the industry here.


Keynote at VMworld 2012, with Pat Gelsinger and Steve Herrod



VMworld Barcelona 2012

Pat Gelsinger is on stage in his new role as CEO of VMware.

Pat starts with a look back at history. Waves of change have been the only constant in the IT industry, led by IT innovations and driven by technology. He talks about phases of IT maturity, reactive vs. proactive.
The transformation is in each layer, starting with consumers, IT departments, people, operations, and procedures. It began in various layers simultaneously, and he talks about the infrastructure layer first, of course.

From server to cloud.

This means that while 25% of servers were virtualized in 2008, today it is around 60%, and the expectation is that in the near future more than 90% of all workloads will be virtualized. This also has an impact on the provisioning of servers: from weeks, to days, and in a few years to minutes and seconds or less. This can only be done by introducing a new paradigm: automation.

Next topic: the software-defined datacenter.
Introducing the VMware perspective: everything is virtualized. Pat talks about the huge legacy in the datacenter of today; the only way to end the dilemma is to abstract, pool, and then finally fully automate, leading to the manufacturing concept of “just in time”, or everything as a service.
The software-defined datacenter is based on the vCloud Suite, the basic element at the beginning of the journey.

Pat Gelsinger on vCloud Suite

The suite is comprehensive, delivers the highest performance and proven reliability; check it out.
Pat now announces that vRAM will no longer exist: based on customer feedback, VMware has decided this is no longer the way to go. The new model is

  • priced per CPU
  • one easy solution
  • has no limitations

Back to the software-defined datacenter, diving into the management philosophy: more automation, less management. This will turn IT into a broker of services, which also means that management itself must change.
Policy-based automation is the key to managing the future datacenter, taking time and human error out of the equation. It covers:
service provisioning, including vCloud Automation and vFabric Application Director;
operations management, including the vCenter Operations Management Suite; and
business management, including the IT Business Management Suite.
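As a toy illustration of the policy idea (the tier names and sizes below are invented, not VMware's model): provisioning reads a policy table instead of relying on a human decision each time, so the same request always yields the same result.

```python
# Sketch of policy-based provisioning: a request names a service tier,
# and the policy, not a human operator, decides the resources.
# Tier names and sizes are invented for illustration.

POLICY = {
    "gold":   {"vcpus": 8, "ram_gb": 32, "storage": "flash"},
    "silver": {"vcpus": 4, "ram_gb": 16, "storage": "hybrid"},
    "bronze": {"vcpus": 2, "ram_gb": 8,  "storage": "disk"},
}

def provision(name, tier):
    # Deterministic lookup: no per-request human judgment, no typos.
    spec = POLICY[tier]
    return {"name": name, **spec}

vm = provision("erp-frontend", "silver")
print(vm["vcpus"], vm["storage"])
```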

The CEO now talks about cloud infrastructure around software, technology, and architecture. This leads to the next area: how do I operate in the new world? It is about people, culture, and organization, as well as process and control and IT business management.
A new forum is being formed to shape the ecosystem, called the “Cloud Ops Forum”.
Moving on to the multi-cloud environment: vCloud, physical, non-VMware, and public clouds.
Pat mentions that PaaS is key, represented by Cloud Foundry; automated service provisioning by DynamicOps; and software-defined networking and security, where the keyword is Nicira.

Addressing the multi-cloud world, the VMware executive claims the company is ready for the open world.
The next highlight is applications: moving apps into new dimensions with vFabric and application transformation.
Driving this transformation has only been possible through the huge partner ecosystem.
Pat now moves the mic over to Chief Technology Officer Steve Herrod.

Future Storage Directions

Steve starts by talking about the next-generation vCloud Suite and digs in: VMware, vSphere, and virtualization, focusing on Exchange, SharePoint Server, SAP, and Oracle.

What is a Monster VM? Drive this to the edge:

  • 2011: 32 vCPUs, 1 TB per VM, and 1M IOPS per host
  • 2012: up to 64 vCPUs, 1 TB, and more than 1M IOPS per host

So what about tomorrow's applications? What are they? In telco, the VoIP issue is latency, and VMware is looking into that: based on vFabric, they are looking at shared virtual memory. HPC is also coming to virtualization, and for PaaS like Cloud Foundry, Steve introduces more on the Hadoop integration.


Learn from the past, think about the future, but live in the now!


When I prepared for my final exams at university, one of my professors mentioned that their intention was not to guide us to good jobs immediately after university; the key was that we as graduates would have long-term chances to find and keep excellent jobs. At the time I thought that was stupid: my profession starts with my first job, not in 10 years.


Die of an Intel 80486DX2 microprocessor (actual size: 12×6.75 mm) in its packaging. (Photo credit: Wikipedia)

Time showed me that it is key to have both in mind when you start, and that it is essential to stay laser-focused.

The same holds in datacenters, where CIOs are confronted with the new reality of BYOD, the data deluge, cyber attacks, and an explosion of complexity. Positions and processes that worked for years no longer seem adequate, personnel get confused, and the business units often claim to have more IT knowledge, since they use flexible services on the Internet. The CIOs and IT managers I talk to often seek external help and call in consultants, who tell them that everything has to change, processes have to be reworked, and headcount has to decrease.

What a challenge!

Most people underestimate the finest attitude mankind has: the ability to change and adapt. Of course this is not easy; people like to be in their comfort zone and don't like to be moved or to learn new skills, to move on and modify, even though they do this all their lives.

The most successful IT organizations I know have mastered this to a superlative degree, taking their employees along on the road and establishing a culture of change. As we learn from the past, the change never stops: we moved from monolithic mainframes to minis to servers to PCs and now to tablets; from proprietary OSes to open source; from complex-instruction CPUs to RISC and back; from multiple CPU architectures to multiple servers to multi-core architectures; from ASCII terminals to mouse-driven interactive PCs to gesture-driven tablets; from 640 KB of RAM to terabytes of RAM; from 5 MB disk drives to 4 TB and now flash technology. So we know the past, and we understand that telecommunications, an enterprise asset 10 years ago, is now an outsourced, VoIP-driven business.

So what will the future bring to plan for? New devices like Google Glass; faster storage; keyboards giving way to voice-driven interaction with computers; in-memory databases built into a cloud OS; a move from applications to functions; from physical infrastructure topologies and silos to software-defined datacenters; from an application-centric approach to information-centric concepts; from blade farms to virtual workloads that execute independently of the blades.

We will see computers talk to computers and make decisions for us; devices will generate information that is relevant only for seconds. Batch processing will become too slow to keep pace;


This image was selected as a picture of the week on the Farsi Wikipedia for the 13th week, 2011. (Photo credit: Wikipedia)

it will be replaced by big-data algorithms.

On the consumption side, we will purchase only what we consume; vendors will have to deal with end customers; enterprises will move legacy workloads to specialized and focused workbenches; and all information will have to be secured and trusted before it is transferred.

So the key for the future will be all about information: generating, capturing, storing, manipulating, destroying, and analyzing it. We call this “the gravity is on the information”.

Based on that, CIOs and IT managers have to act, and do act, in the now to prepare: build an agile infrastructure, and make datacenter investments in skills, new technology, information, and big data that support workload management and secure the information that is critical to the enterprise or to the privacy of an employee or customer. An enterprise's IT investments should help build and prepare an agile datacenter around the information, ready for the near future and beyond.


Running the most powerful iPad


VMware View

In recent years, consolidation on the one hand and standardization on the other have lifted users' expectations toward always-on, endless computing power. Even compute-intensive applications have moved into the cloud. With the challenge of the multi-device user experience, many consumers of IT have experienced a lack of integration. So applications like Dropbox or Box.net, and even VMware's Project Octopus, were created to consolidate information in one single source.

Enterprises have trouble with that, since a layer of trust and control may leave the ship. So, taking into account that we use more than one device, depending on personal lifestyle and on the expectation from culture or business to be online more often, the devices need to be always on, trusted, and reliable.

We call this the user experience. VMware has taken it into account and developed its virtual desktop infrastructure (VDI), which runs your desktop on a server and “beams” the screen to your device.

This comes with a lot of advantages.

  • First and foremost, it can run all the time. There is no need to shut down at all. I am not talking about sleep; it runs. This implies that Outlook rules, for example, will always be executed.
  • Second, it is fast. Since all the VDI desktops run on the same hardware, or close to it, communication is fast. Resources can be shared, and infrastructure is no longer tied up when a user is inactive; others can leverage the remaining power.
  • Third, administration can be done much more easily. Since I have used VDI for business, there has been at least one incident where our antivirus provider sent out a corrupted update, which blocked most of our laptops. The VDI was affected too, but it was shut down, reinitialized, and rebooted to recover: more than 1,000 virtual laptops were restored in minutes by two employees. Physically fixing the laptops in the field took days and involved most of the IT specialists.
  • Fourth, security is key in enterprises. Even though VDI is not free from fraud, it can at least be controlled better by the admins, and security patches are installed in one big shot.
  • Fifth, backup: it is all in the datacenter, and most of the information is redundant, so deduplication products like Avamar or Data Domain can be much more effective. Restore is fast too, since it happens in the datacenter.
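Why deduplication pays off so well in VDI can be shown in a few lines: near-identical desktops share most of their chunks, and each unique chunk is stored only once, keyed by its hash. This is a simplified content-addressing sketch, not how Avamar or Data Domain actually work internally.

```python
# Toy content-addressed deduplication: each desktop image is split into
# fixed-size chunks, and identical chunks (the shared OS bits) are
# stored only once, keyed by their SHA-256 digest.
import hashlib

store = {}  # digest -> chunk bytes

def backup(image, chunk_size=4):
    refs = []
    for i in range(0, len(image), chunk_size):
        chunk = image[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # duplicates cost only a reference
        refs.append(digest)
    return refs

# Two desktops sharing most of their "OS" bytes, differing in user data.
desktop_a = b"OS_COREOS_LIBSUSER_A__"
desktop_b = b"OS_COREOS_LIBSUSER_B__"
backup(desktop_a)
backup(desktop_b)
print(len(store))  # far fewer unique chunks than two full images
```

With hundreds of desktops built from the same golden image, almost every chunk after the first desktop is a duplicate, which is why both the backup and the in-datacenter restore are so fast.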

So what is the catch? In my experience, VDI is hard to use for offline travelers and over connections with very high latency.

Using it for my daily work lets me use my apps, like Outlook, PowerPoint, and salesforce.com, on my iMac, iPad, and MacBook at the same time, with no boot delay, easy access, and 100% support from the IT department.

Even though my iPad is not the best device for producing work, it can leverage all the legacy applications I have to use, and through the VDI it gives me the best infrastructure to run my desktop: a Vblock from VCE.

Many IT experts I have spoken with see VDI as the key to BYOD and enterprise demands, but the infrastructure has to be flexible enough to absorb the enterprise's resource demand, since it is not easy to predict how many desktops will run at the same time. This is not entirely true, since VMware and VCE have good models and experience for designing this.

My strong belief is that with user-centric IT, VDI will be the future of enterprise desktop management, delivering the power of the enterprise to the user's device: quick, easy, and reliable.

Related: 

VMware View demo on iPad