Enable decision-grade digital twins that validate future operational behavior
We expected a replay-based digital twin, but what stood out was that the system behaved independently. Historical alignment stopped being the measure of trust once we saw how the model responded when conditions changed. It became the first digital twin we were comfortable using to test future scenarios.
Executive Introduction
Digital twins have become a standard component of modern drilling and well operations. Yet despite widespread adoption, most digital twins are not trusted for decision-making. The reason is structural: they are designed to replay what already happened rather than behave like a live system that can produce alternate outcomes. This case documents a materially different result — a digital twin that reproduced real operational behavior with measurable fidelity and, critically, could diverge correctly when conditions changed.
Organizational Context
This case involved a large, technically sophisticated service company responsible for complex managed pressure drilling operations across multiple assets. The organization had extensive experience with conventional digital twins, MPD simulators, and historical data replay tools. These systems were routinely used for post-job review, training illustration, and stakeholder communication.
However, they were not used for validation.
Engineering and operations teams did not trust existing digital twins to evaluate future scenarios or pre-deployment decisions. The systems looked realistic, but their behavior was tightly coupled to historical data streams or scripted logic. If conditions changed, the models either failed to respond meaningfully or required manual reconfiguration and restart.
As a result, digital twins remained explanatory tools rather than decision instruments.
How the System Was Used
Endeavor’s platform was used to create a live digital twin of a completed MPD operation. Recorded rig PLC data was ingested directly, and the well configuration was instantiated automatically using DOT, eliminating manual reconstruction and interpretation error.
Rather than replaying data, the digital twin executed as a continuously solved runtime system. Pressure, flow, and equipment behavior were generated dynamically by the model while being compared directly against recorded operational data.
Engineering teams overlaid simulated outputs against real data to evaluate alignment. They then altered operating conditions intentionally — modifying pressures, flow constraints, and timing — to test whether the system could diverge appropriately while remaining physically coherent.
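The overlay evaluation described above can be sketched as a simple tolerance-band comparison between a recorded signal and the model's independently solved output. This is an illustrative sketch only: the function name, the tolerance value, and the sample pressure traces are hypothetical, not Endeavor's actual alignment metric or data.

```python
import numpy as np

def alignment_score(recorded, simulated, tolerance):
    """Fraction of samples where the simulated signal stays within
    a tolerance band around the recorded signal (illustrative metric)."""
    recorded = np.asarray(recorded, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    within = np.abs(simulated - recorded) <= tolerance
    return within.mean()

# Hypothetical recorded standpipe pressure trace (psi)
recorded = np.array([1500.0, 1510.0, 1525.0, 1530.0, 1528.0])
# Model output solved independently over the same interval
simulated = np.array([1501.0, 1509.0, 1524.0, 1531.0, 1529.0])

print(f"alignment: {alignment_score(recorded, simulated, tolerance=5.0):.2%}")
# -> alignment: 100.00%
```

Under this kind of metric, a replay-based twin scores high by construction, so the more telling test is the second step the teams performed: change an operating condition and confirm the simulated trace departs from the recorded one in a physically coherent way.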
Characterization of the Structural Change
Conventional digital twins are structurally descriptive. They depend on synchronization, signal replay, or fixed solvers that assume historical conditions. While visually compelling, these systems cannot produce alternate futures without being rebuilt or re-run.
Endeavor’s digital twin did not depend on replay. Behavioral alignment with recorded data exceeded 99.65%, including equipment actuation timing, without curve fitting, scripting, or forced synchronization.
More importantly, when conditions were changed, the system diverged immediately and correctly. This confirmed that the model was solving physics rather than reproducing signals.
This capability marked a decisive shift. The digital twin was no longer bound to history. It became a validation-grade system capable of testing what would happen if, not just what happened.
“The twin behaved correctly when history no longer applied.”
Value Captured & Realized
Knowledge and Insight
Engineering teams gained confidence that the digital twin represented causal behavior rather than appearance. This eliminated long-standing ambiguity over whether deviations were artifacts of the model or genuine operational risk.
The organization developed a clearer understanding of which outcomes were sensitive to timing, sequencing, and interaction — insights that cannot be extracted from replay-based systems.
Operational Impact
Pre-deployment validation cycles shortened materially. Rather than relying on physical testing, conservative assumptions, or post-deployment correction, teams could evaluate operational readiness within the digital twin itself.
Across deployments, validation timelines were reduced by 30–50%, accelerating decision-making without sacrificing confidence.
Cost and Risk Implication
Avoiding even a single MPD non-productive event — often measured in $500,000 to $2 million USD depending on rig class — justified adoption. More importantly, the organization reduced reliance on tools that could not be trusted to diverge, lowering exposure to high-consequence surprises.
Established Outcome
A digital twin that cannot diverge is a visualization. A digital twin that diverges correctly is a validation tool. This case established Endeavor’s platform in the latter category, redefining the role digital twins can play in high-consequence operations.
Closing Perspective
Most digital twins explain the past. Endeavor’s digital twin tested the future. By replacing replay with behavior, the platform transformed digital twin technology from an illustrative asset into a decision-grade system. In environments where timing, interaction, and consequence matter, this distinction is not academic — it is operationally decisive.