The Quiet Restructuring: When Frontier Models Meet Legacy Reality and the Rise of the Context Engineer


Over the last three years, something has shifted in enterprise IT that is harder to name than to feel. It is not one technology, not one framework. It is the slow realization that large frontier models — the advanced reasoning systems from major AI labs — have stopped being an experiment and started being a structural force. They sit now in the middle of how we develop, how we operate, and how we think about the people who do this work.

I have spent thirty-five years in enterprise technology, from mainframes through cloud-native. And I have never seen a shift that touches so many layers simultaneously while being so quietly underestimated in its organizational impact.

From Writing Code to Owning Lifecycles

For most of my career, a developer was someone who wrote code. Good code, hopefully. But the primary measure was always output — features shipped, bugs fixed, lines committed. That model is dissolving.

When advanced coding systems from major AI labs can produce a working function in seconds, typing code loses its weight. What gains weight is everything around it: understanding what should be built, validating that what was generated fits the architecture, and owning the lifecycle from deployment through decommissioning.

I saw this when one of our teams used a state-of-the-art foundation model to refactor a payment processing module. The model produced clean code in minutes. But it took a senior engineer three hours to verify that the refactored logic preserved every edge case from twelve years of business rules. Those three hours were the real work.
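
To make that verification work concrete: one pragmatic technique is a characterization test that replays recorded production inputs through both the legacy code and the generated code and diffs the outputs. The sketch below is illustrative only; the function names and the JSON case format are my assumptions, not our actual harness.

```python
# A sketch only: `legacy_fn` and `refactored_fn` stand in for the old and
# the AI-refactored payment functions; recorded cases are assumed to be a
# JSON list of {"input": {...}} objects exported from production logs.
import json

def load_recorded_cases(path):
    with open(path) as f:
        return json.load(f)

def verify_refactoring(cases, legacy_fn, refactored_fn):
    """Replay each recorded input through both versions and diff the results."""
    mismatches = []
    for case in cases:
        old = legacy_fn(**case["input"])
        new = refactored_fn(**case["input"])
        if old != new:
            mismatches.append({"input": case["input"], "old": old, "new": new})
    return mismatches  # each entry is a bug or an undocumented business rule
```

Every mismatch the harness surfaces is either a defect in the refactoring or a business rule nobody wrote down. Both are exactly what those three hours are for.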

The Amplification Trap

There is a real danger that I observe in organizations moving fast with these tools. I call it the amplification trap. Because frontier models are so capable of producing plausible output — code, documentation, test cases, infrastructure definitions — there is a tendency to trust without adequate verification.

When I started my career, a junior developer who copied code from a manual without understanding it was considered negligent. Today, a team that accepts AI-generated Terraform configurations without reviewing them against their security baseline is doing the same thing, just faster.
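
Closing that gap does not require heroics; even a small automated check against the baseline catches the worst offenses before review fatigue sets in. Here is a minimal sketch, assuming the standard JSON layout produced by `terraform show -json` and a stand-in baseline rule of "no ingress open to the world"; your real baseline will have many more rules.

```python
# Scan a Terraform plan (as JSON) for security groups that allow ingress
# from 0.0.0.0/0. The plan JSON layout follows `terraform show -json`.
import json, sys

def open_ingress_violations(plan):
    """Return addresses of planned resources with world-open ingress rules."""
    violations = []
    for rc in plan.get("resource_changes", []):
        after = (rc.get("change") or {}).get("after") or {}
        for rule in after.get("ingress") or []:
            if "0.0.0.0/0" in (rule.get("cidr_blocks") or []):
                violations.append(rc["address"])
    return violations

if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        plan = json.load(f)
    for address in open_ingress_violations(plan):
        print(f"REVIEW: {address} allows ingress from 0.0.0.0/0")
```

The point is not this particular rule; it is that "let me verify" can be cheap enough to run on every generated change.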

The skill requirement has shifted. We need people who can read generated code critically, who understand architectural patterns deeply enough to spot elegant but wrong choices, and who have the discipline to say “let me verify” instead of “ship it.”

The Context Window Problem as Architecture Constraint

Here is something that few technology leaders discuss publicly but that will define the next wave of enterprise modernization: the context window is an architecture constraint, and a hard one.

Consider a typical legacy codebase: a million lines of an older language like COBOL or Delphi, built over two decades. It works. It runs critical business processes. And it does not fit into a context window. No frontier model today can ingest that codebase holistically and reason about it as a whole. The model sees fragments, isolated modules without the web of dependencies that give them meaning.

This led me to what I consider a genuinely new role in enterprise IT: the Context Engineer. This is the person who fragments, indexes, and prepares legacy code so that AI systems can consume it meaningfully. They decide which 40,000 lines of a 300,000-line module matter for a specific modernization task. They build the retrieval layer that feeds the right context to the model at the right time.
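
What might that retrieval layer look like? A minimal sketch, assuming Delphi sources and a crude relevance score based on identifier overlap. A production system would add dependency graphs and embeddings, but the shape of the job is the same: chunk, index, select within a budget.

```python
# Split legacy sources into chunks, index the identifiers each chunk
# contains, and select only the chunks relevant to one modernization task.
# Identifier overlap is a deliberate simplification for the sketch.
import re
from pathlib import Path

def chunk_file(path, lines_per_chunk=200):
    lines = path.read_text(errors="ignore").splitlines()
    for i in range(0, len(lines), lines_per_chunk):
        yield "\n".join(lines[i:i + lines_per_chunk])

def identifiers(text):
    return set(re.findall(r"[A-Za-z_][A-Za-z0-9_]{2,}", text))

def select_context(repo, task, budget_chunks=10):
    """Return the highest-scoring chunks for a task description."""
    task_ids = identifiers(task)
    scored = []
    for f in Path(repo).rglob("*.pas"):   # e.g. Delphi sources
        for chunk in chunk_file(f):
            score = len(task_ids & identifiers(chunk))
            if score:
                scored.append((score, f.name, chunk))
    scored.sort(reverse=True, key=lambda t: t[0])
    return scored[:budget_chunks]         # these go into the context window
```

The budget parameter is the whole game: it encodes the decision about which 40,000 lines matter.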

Your modernization speed is no longer limited primarily by AI capability. It is limited by how well you have organized your legacy knowledge for AI consumption. The Context Engineer determines the modernization velocity. I have not seen this role in any job description yet, but it will be there within two years.

Fewer Applications, More Agents — The Enterprise Shift

Something equally fundamental is happening on the business application side. For decades, enterprise IT meant buying or building applications — ERP systems, CRM platforms, reporting tools — each a monolith of screens and workflows that humans navigated manually.

What I see emerging is different. Advanced AI systems enable a shift from applications to agents and workflows. Instead of a procurement officer navigating seven screens to approve a purchase order, an AI agent reviews the request against policy, checks budget, flags anomalies, and presents only the decision point. The human still decides. The cognitive overhead is gone.
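
As a sketch of the pattern (all names, checks, and thresholds here are illustrative, not a real procurement system): the workflow runs the checks, and the human sees one summary instead of seven screens.

```python
# A purchase-order agent workflow reduced to its skeleton: gather policy
# and budget findings, then surface a single decision point to the human.
from dataclasses import dataclass

@dataclass
class PurchaseOrder:
    vendor: str
    amount: float
    cost_center: str

def check_policy(po):
    issues = []
    if po.amount > 10_000:                  # illustrative threshold
        issues.append("amount exceeds single-approval limit")
    return issues

def check_budget(po, remaining):
    if remaining.get(po.cost_center, 0) < po.amount:
        return [f"cost center {po.cost_center} lacks budget"]
    return []

def prepare_decision(po, remaining):
    findings = check_policy(po) + check_budget(po, remaining)
    # The human sees only this summary and decides; the agent never approves.
    return {"order": po, "findings": findings,
            "recommendation": "approve" if not findings else "review"}
```

The design choice that matters is in the last comment: the agent prepares, the human decides, and accountability stays where it belongs.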

This means less management overhead. Not because managers are replaced, but because information preparation — collecting data, formatting reports, chasing updates — is increasingly handled by intelligent workflows operating on massive amounts of data. What remains is the core: judgment, decision, accountability.

I saw this in operations, where we moved three reporting processes from manual Excel assembly to AI-driven data pipelines. The time saving was significant. But the real gain was that our team leads could focus on interpreting data instead of compiling it.
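
Reduced to a sketch, one of those pipelines might look like this; the CSV layout and column names are my illustration, not the real system. The point is that assembly and aggregation run unattended, and the team lead starts at the summary.

```python
# Collect extract files, aggregate them, and write the weekly summary that
# used to be assembled by hand in Excel.
import pandas as pd
from pathlib import Path

def build_weekly_report(extract_dir, out_path):
    frames = [pd.read_csv(p) for p in Path(extract_dir).glob("*.csv")]
    data = pd.concat(frames, ignore_index=True)
    summary = (data.groupby(["team", "week"], as_index=False)
                   .agg(tickets=("ticket_id", "count"),
                        avg_resolution_h=("resolution_hours", "mean")))
    summary.to_csv(out_path, index=False)
    return summary
```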

The Psychological Dimension

What concerns me most is not the technology. It will get better. What concerns me is the psychological impact on people who have built their professional identity around skills that are visibly changing.

A senior Delphi developer with twenty years of experience watches a frontier model generate code in a language they do not fully know. A system administrator who spent years mastering infrastructure sees an AI system propose a complete IaC deployment. These moments touch professional identity.

The honest answer is this: the experience of those professionals is more valuable now, not less, but in a different way. Their deep understanding of how systems behave in production, of what breaks at scale, of where business logic hides — this is exactly what models cannot learn from training data alone. The challenge is helping people see that shift as an elevation, not a loss.

So What Changes for People?

Everything and nothing. The tools change. The speed changes. But the fundamental truth remains: someone has to understand what the business needs, someone has to ensure systems are reliable and secure, and someone has to be accountable when things go wrong.

What changes is the shape of the skills. Validation over generation. Architecture thinking over implementation speed. Context engineering over raw coding. Critical reasoning over mechanical execution. And the willingness to learn continuously in an environment where the ground shifts every few months.

After thirty-five years, I still learn something new every week. The learning curve is steeper, the tools more powerful, and the margin for complacency smaller than ever. But the people — the developers, the platform engineers, the security specialists — they remain at the center. Not because it is comforting to say. But because it is true.

Transforming the Enterprise


In recent years, the change in many industries has been a move away from a traditional business approach, developed over decades, to a software-defined version of it. There are compelling reasons why this has happened.


When industry develops physical products, it often takes a decade or more before they come out. Take cars, for example: everything currently in mass production started design and development within the last five to ten years, and updates often wait for the next generation of the product cycle.

This is unavoidable for anything mechanical, but software can be adopted much faster. A good example is Tesla Motors, which changed the industry with the concept of building a computer in the form of a car. Software is updated nightly over the air, and new functionality becomes available to the driver or passengers. But it is not only the product that has changed; the selling of this kind of car is different too. While traditional car dealers face the exercise of training all their sales personnel on new functions and features, new leasing models, and service capabilities so they can explain them to customers, modern companies move the sales structure to the internet with a model that is easy to update and adjust. Options and selling capabilities then depend on the flexibility and creativity of the company rather than on the salesforce and its adaptability. The new model that traditional enterprises stumble into demands the adoption of agile and innovative behavior and processes to capture demand and open new segments of business.

Why has this happened?

Because it is possible. With the appearance of the cloud and the models it supports, startups have shown that it is easy to build a business without a large investment in infrastructure or a data center. Even more: where in the past you had to ask investors for a large amount of money to build the data center, now you can pay as you build your business. This channels capital into the business model rather than into the IT landscape. But this is only one aspect. With the commoditization of IT resources and container-based IT, it is far more cost-efficient and reliable to build enterprise-class IT with a minimum of investment. However, there is a trap many companies will fall into: standardization. There is currently a belief that one cloud standard, driven by the cloud providers, can be the right one, but history has shown that this leads to more cost and will be replaced in time by an industry association. We see this on the horizon with OpenStack already, which is still far from enterprise-ready. The key will also lie more in the PaaS layer, with open software like Cloud Foundry and Docker, which opens a broader ecosystem for applications and operations.


So what about enabling the “new” enterprise model?

The new model will be driven by innovation in software and applications. In my daily talks with large companies and customers, many of them are thinking about how to implement these two aspects in their business process modelling. Often the effort is driven out of the IT department, but the link to the business and its drivers is missing or simply not established. I also see large enterprises and global companies investing in application development through the lines of business, building a second pool of IT knowledge that is richer in business understanding but weaker in agile development. This not only leads to wrong assessments of the best development environment; it also creates a new class of information islands. In the long run this will not be the right innovative approach for many enterprises, even though it lets them adapt and compete much better with the new kids on the block, the startups. My advice to CIOs and cloud architects is always to engage actively with these business departments and help them change to a more agile and innovative model, something we call continuous innovation, and in return bring IT expertise into the right strategic decisions for the company.

IT providers like EMC and its federation enable this process and provide guidance through it. With various iterations, EMC can analyze the current status of an IT department and show the path from a 2nd-platform concept to the modern web-scale architecture that the 3rd-platform concept demands. Since this is not a “shoot once and forget” exercise, in IT terms the “new” model means constant change. Where the past was about managing resources and striving for more synergy and “innovation” through new hardware and software, over the next decade IT departments will act more as brokers of public and private cloud, perhaps also for other companies as an additional service.

How to proceed?

It is not simple, and it has to proceed step by step. The current change of the business model in many verticals is driven not only by development and operations aspects; it is also deeply influenced by big data concepts, which often lead to an Internet of Things discussion. Silos and public cloud may be part of the answer, but in many cases I see the key to success in a joint effort between the business units and the people responsible for IT in the enterprise.

What Kind of Analytics?


In various discussions with my customers and colleagues, I have experienced quite a controversy around analytics.

Some understand it as “what has happened”, also called root-cause analysis; others want to predict the future, meaning “what will happen”; a third group tries to answer the question “what could happen”. Funnily enough, the answer is often a combination of these. In our very fast decision-making and consuming society, this can cause some friction.

In the first case, let us call it descriptive analytics, companies invest money to understand an incident only after it has happened, unpredicted, and to avoid a repeat.

Often it is difficult to get to the root cause, simply because the data or information needed to get to the bottom of it is no longer available. In the second case, statistical models show the business, based on historical information, what can go wrong; let us call it predictive analytics. In the last case, data scientists use machine learning to explore the possible future based on different information streams; this is known as prescriptive analytics. It is like chess: at the first move everything is possible, and some predictions can be made based on what is known about the players; as more moves are played, the information sharpens and the likely continuations become clearer.
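
A toy example makes the three flavors tangible. Descriptive looks back, predictive extrapolates, prescriptive weighs scenarios; the numbers and capacities below are invented purely for illustration.

```python
# Descriptive, predictive, and prescriptive analytics on invented
# monthly incident counts.
from statistics import mean

incidents = [4, 6, 5, 9, 8, 11]  # last six months

# Descriptive: what has happened?
print("average incidents per month:", mean(incidents))

# Predictive: what will happen? A naive linear trend stands in for a
# proper statistical model.
n = len(incidents)
xs = range(n)
slope = (sum(x * y for x, y in zip(xs, incidents)) - n * mean(xs) * mean(incidents)) / (
    sum(x * x for x in xs) - n * mean(xs) ** 2
)
intercept = mean(incidents) - slope * mean(xs)
forecast = intercept + slope * n
print("forecast for next month:", round(forecast, 1))

# Prescriptive: what could happen? Weigh candidate actions against the
# forecast; capacity per staffing level is an invented assumption.
for extra_staff in (0, 1, 2):
    capacity = 8 + 3 * extra_staff
    verdict = "covers" if capacity >= forecast else "falls short of"
    print(extra_staff, "extra staff", verdict, "the forecast")
```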

The human brain is capable of doing this kind of prescription on the fly; we call that experience. This is, by the way, a reason why young managers and entrepreneurs often fail, but that is worth another blog. The key here is that in an expert, the brain has built up enough information and channels to do both the predictive and the prescriptive.
Having the scientists build the model, meaning the mathematical representation of the combined information, can lead to a variety of possible outcomes; we call this data science or machine learning. Key is the amount of information: it must be deep and reach back a long way historically. This is also one of the boundaries.

Many CIOs and CTOs at the companies I talk to today do not keep the data, for many reasons, mostly cost. So what can a data scientist do then? Simple: build the model and then run it. Over time it will get better as it preserves information. It is like chess, where you can watch top professional players analyze and predict moves to a depth the untrained could not imagine, or even provoke moves. That level is not yet embedded in this science.

A couple of weeks ago I was fortunate to visit a start-up and talk to the founders. It was all about customer intimacy and combining information within a company to serve a better customer relationship. I would drive this one dimension deeper. Why not use this as an internal knowledge base? Let us predict what the corporate user needs to find around a specific subject; not only internal sources but also external ones can be used here.

I am still wondering why HR representatives and headhunters do not use predictive analysis to identify the right candidate for a job.
Yes, you are right: the information in a company is often filed away in such a way that we have no possibility to combine it. Here we arrive at the concept of a data lake: building one repository of information and using it for internal and external benefit.

However, the last topic is the “what could happen” case, prescriptive analytics. This is all about possibilities, and often about risks. I guess this is used in Six Sigma, and humankind has always done it: the mother warns the child about a possible accident when walking along the street. Corporate strategic planning is all about such scenarios. I think this method really means “I do not have enough data to be more precise.”

So what to do? Keep all the data, maybe randomized and anonymized, but keep it, because you never know what business can be built out of the treasure the company has generated over the years.
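
Anonymizing before archiving can be as simple as a keyed hash, so the same customer always maps to the same token without being reversible by anyone who lacks the key. A minimal sketch with illustrative field names:

```python
# Pseudonymize a customer identifier with an HMAC so joins across archived
# records still work, while the raw identity is not recoverable without
# the key. The key itself belongs in a managed secret store.
import hashlib, hmac

SECRET_KEY = b"rotate-me-and-store-me-in-a-vault"  # illustrative secret

def pseudonymize(value):
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

record = {"customer_id": "C-10293", "amount": 129.90}
record["customer_id"] = pseudonymize(record["customer_id"])
print(record)  # amount preserved for analytics, identity tokenized
```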

Welcome to the Age of Information!

Disrupt the Disruptor



In a recent interview, the CEO of Pivotal, Paul Maritz, stated that he wants to disrupt the disruptor. The AWS platform offering addresses the majority of public cloud demands today, and Pivotal wants to play strong here too.

So what is the disruption AWS is addressing?
“Fast”: a data center by credit card, to name one. “Easy” may be another. “Cheap” is also stated. All this has nothing to do with a conventional data center; quite the opposite, since slow, complex, and expensive is what CIOs today see in their environments. This is the disruption. There is only one missing piece: “trust”, or “risk”. Is that not what data centers are built for?
Let us circle back in time to when the mainframe was born; call that platform one. The majority of applications written there defined the production systems. Of course there were few users and few demands, but the origin lay with the developers.
History has shown that only one player was left after 30 years. The programs are still running and working, but they are hard to maintain and cannot keep in line with current standards of customer experience.
With the age of the PC, and more so with the appearance of Linux and the x86 architecture, the second platform was born; we called it client-server architecture. Here too the developers led the way with creativity and innovation. Many languages were born, culminating in the internet itself. The hardware was still very expensive and followed Moore's law for years. Microsoft became the dominant player. This generated the complexity in the data center, since the legacy had to work with the new stuff. Silos were built, and middleware controlled the systems.
“Suddenly” companies like Google and Facebook were born, and storage, CPU, and network were more or less given away for free. They connected billions of users with millions of applications. The app century was born; let us call it the third platform. And again the developers paved the way. From my perspective, most current apps are specialized browsers with a nice interface to information.
And there it is: the buzzword we will see more and more in the future. Information, which is data with meaning, was always generated inside companies and kept very close. Now open data has appeared, mobile devices generate more data, and information can suddenly be enriched to put new business models in place.
Where business leaders in platforms one and two had to invest in data centers without knowing how large the business would grow, they can now purchase on the go and on demand via a credit card. Nice.
Many of the apps and services consumers buy today leverage such infrastructure.
Hence, guess what: most of the startups leverage this model.


Now how to disrupt this growing business?
The answer lies in defining which business. Large global companies will have big trouble running their business on the current offerings. Even if the current cloud providers adapt fast and are very creative, there will be a lot of legacy that cannot be transported, just as the mainframes could not be transported to client-server architectures. The key is in combination, or bridging. Investments that have been made, and will be made, to support current business models have to be adjusted for the next-generation architecture. EMC's federation approach is aiming for that. Infrastructure complexity will be solved by converged and software-defined X concepts; in optimization and orchestration of the infrastructure, VMware leads the industry by a magnitude; and Pivotal will provide the open platform architecture to combine the business needs of current and future demands.
When we talk technology: Cloud Foundry and major parts of Pivotal One, which comprises many superb technologies, are open and crowd-developed to capture the great ideas of this planet at exponential scale. This is the real disruption. Platform one was very country-centric, and platform two was dominated by the thinking of the Western world; the next platform has to address a global demand and population.
