Sensat News

Bringing digital twins down to earth

The future of digital twins - Part 1

June 16, 2025

The term "Digital Twin" often evokes a sense of a comprehensive virtual replica updating in real-time, offering seamless collaboration and predictive foresight. However, its very generality has led to a wide spectrum of interpretations and considerable confusion regarding its true definition, practical applications, and real-world value. This broadness results in a polarised view: at one end of the spectrum, some remain sceptical of digital twins' utility, while the other end harbours unrealistic expectations that a single piece of software will serve as an all-encompassing solution to every challenge within their industry.

Like many transformative technologies before it, the concept of the digital twin has navigated the familiar curve of the hype cycle. Almost a decade ago, the world witnessed the peak of inflated expectations surrounding its potential. Since then, understanding and application of digital twins have matured at different rates across different sectors, and the use of the term has grown to describe vastly different systems. Early adopters within the AECO (Architecture, Engineering, Construction, and Operations) sector are now demonstrating tangible and substantial returns on their investments, signalling a move onto the more grounded "slope of enlightenment".

However, driving this industry forward is not only about technological maturity. It also demands a collective effort to cultivate a broader understanding of what can be achieved and what changes to business processes and best practices need to occur in parallel.

Source: Gartner (August 2018)

Cutting through the noise

Digital twins are often broadly defined as “virtual models of real-world physical products, assets, environments, or processes, serving as digital counterparts for purposes like simulation, integration, testing, monitoring, and maintenance”. This definition promises transformation but also breeds confusion.

A fully integrated, live, smart digital twin is undoubtedly a worthy long-term goal. However, framing it as the immediate starting point can be counterproductive, especially given the diverse levels of digital maturity and tooling currently present across the AECO sector. Drawing on insights from industry experts and innovators, we've identified three key pitfalls with this framing:

  • Discouraging early adoption: New buyers and stakeholders can be deterred by the sheer scale and complexity of an all-encompassing system.
  • Unrealistic expectations: The promise of a fully integrated, real-time twin often overshadows the practical hurdles of data and system integration.
  • Delaying tangible benefits: Trying to run before you can walk obscures opportunities for faster ROI through targeted applications addressing specific AECO challenges.

The solution lies not in the pursuit of a fully integrated, real-time digital twin from day one, but in a fundamental shift in perspective. Instead of asking “how can we build the ultimate digital twin?”, we should be asking “what are the specific problems we are trying to solve, and where can digital twins deliver measurable value today?”. This approach, grounded in clearly defined use cases, allows us to strip away abstraction and focus on tangible outputs. Let the problem guide the technology.

Across the AECO sector, Sensat has already seen digital twins driving real results. Each use case exposes a distinct set of data, collaboration, and automation requirements, all shaped by the problem at hand and the return on investment it can deliver.

Optioneering

Planning new infrastructure, such as a power grid expansion, involves navigating a multitude of geospatial and social constraints to narrow down options. Digital solutions range from automated assessment of thousands of scenarios against defined criteria, through to interactive environments where planners can manually move objects around like a high-stakes SimCity. The crucial outcome is quick, pertinent feedback on what-if scenarios, saving significant time and resources and ultimately leading to more sustainable and socially responsible infrastructure.
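The automated end of that spectrum can be sketched as simple weighted multi-criteria scoring: each candidate scenario is scored against the defined criteria and the results are ranked. This is a minimal illustration only; the criteria names, weights, and scores below are hypothetical, not Sensat's actual assessment model.

```python
# Minimal sketch of automated optioneering: rank candidate scenarios
# by a weighted sum of per-criterion scores. All names and numbers
# here are hypothetical illustrations.

def score_scenario(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of per-criterion scores (each 0-1, higher is better)."""
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical criteria and weights for a grid-expansion corridor study.
weights = {"cost": 0.40, "environmental_impact": 0.35, "community_disruption": 0.25}

# Hypothetical candidate routes, pre-scored against each criterion.
scenarios = {
    "route_a": {"cost": 0.8, "environmental_impact": 0.4, "community_disruption": 0.9},
    "route_b": {"cost": 0.5, "environmental_impact": 0.9, "community_disruption": 0.7},
}

# Rank routes from best to worst overall score.
ranked = sorted(scenarios, key=lambda s: score_scenario(scenarios[s], weights), reverse=True)
```

In practice the scoring functions would draw on geospatial analysis rather than hand-entered numbers, but the shape of the problem, many scenarios evaluated against explicit criteria, stays the same.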

Federated design validation

Major AECO projects often have several contractors coordinating multiple designs and managing complex interfaces. The goal is to identify clashes as early as possible. Requirements can range from automated analyses on micro elements of detailed BIM models, up to professionals eyeballing broad design compatibility issues with the natural environment. In both cases errors can be caught in the digital realm well before action on site, saving time and money on rework.

Logistics and site management

Preconstruction digital rehearsals, often performed months in advance, use a static snapshot of data to identify potential hazards and scheduling optimisations. Daily activity briefings, by contrast, necessitate continuously up-to-date site information to coordinate personnel, equipment and materials effectively. Both approaches aim for proactive risk mitigation and efficient resource allocation but different levels of detail and timeliness of data are suited to each stage of the construction lifecycle.

Asset management

Combining as-built, operational, and maintenance data enables the optimisation of asset performance over time, be that short-term fine-tuning or longer-term portfolio-wide enhancements. This can involve the analysis of high-frequency sensor data, for example tracking flow rates to prioritise wastewater management interventions. Alternatively, higher-level statistics such as aggregated traffic volumes can be used to proactively schedule maintenance and minimise disruptions. This again highlights the variety in technical capabilities needed to deliver on similar goals.
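The flow-rate example above can be sketched as a simple prioritisation rule: rank assets by how far their recent readings deviate from a historical baseline. The asset names, baselines, and readings are hypothetical illustrations, not real operational data.

```python
# Minimal sketch of sensor-driven intervention prioritisation:
# assets whose recent flow rates deviate most from their historical
# baseline rise to the top of the maintenance queue.
# All asset IDs and readings are hypothetical.

from statistics import mean

def deviation(recent: list[float], baseline: float) -> float:
    """Relative deviation of the recent average reading from baseline."""
    return abs(mean(recent) - baseline) / baseline

assets = {
    "pump_station_12": {"baseline": 40.0, "recent": [55.0, 58.0, 61.0]},
    "pump_station_7":  {"baseline": 35.0, "recent": [34.0, 36.0, 35.0]},
}

# Highest deviation first: these assets get attention soonest.
priorities = sorted(
    assets,
    key=lambda a: deviation(assets[a]["recent"], assets[a]["baseline"]),
    reverse=True,
)
```

A production system would of course use more robust statistics and domain thresholds, but the point stands: the same "optimise asset performance" goal can be served by anything from a ranking rule like this to full real-time analytics.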

Ask yourself - “For this specific problem or opportunity, how similar to the physical world does our digital twin truly need to be to deliver meaningful value?”

The 10 Dimensions of Digital Twins

Above, we discussed the confusion surrounding digital twins and the importance of focusing on specific use cases to deliver measurable value today. But once you've identified a use case or problem you want to solve, how do you define the requirements for the digital twin needed to address it?

Effectively answering the question above of "how similar does my digital twin need to be to reality?" demands a structured approach – to that end, we have defined the following framework. Taking the time to build up a thorough understanding before diving into technology solutions will result in a faster path to action and ultimately better ROI.

Sensat's 10 Dimensions for mapping digital twin use cases

Scope: Defining the reach and duration

1. Spatial extent

The extent covered dictates the broad nature of data to be managed and the scale of the insights that can be derived. This dimension defines the physical or systemic coverage that a use-case applies to. It ranges from focusing on a singular independent asset through to national networks.

2. Life-cycle

The breadth of the life-cycle covered influences the data retention and auditability requirements. This dimension defines the stages of an asset or portfolio that the digital twin actively engages with. It spans ideation/optioneering and planning through to operations and decommissioning.

Workflow: Understanding human and machine interaction

3. Audience

Understanding the audience dictates the UI/UX design and the types of insights and presentation required for a given use-case. Unlike the other dimensions this is not a spectrum with a clear continuum or linear progression; instead it is multi-valued. A given use-case can involve one or more parties who will interact with the twin and derive value from it.

4. Participation

The required level of participation influences the need for shared data environments, collaborative tools, and structured workflow/processes. Participation ranges from individual analysis, where a single user interacts with the twin, through to cross-organisational efforts that bring together different areas of expertise and responsibility.

5. Automation

Understanding the status quo and projected value of optimised workflows enables a critical assessment of the level of automation needed for each activity within a given use-case. This has implications across multiple components of any solution, as not everything will have, nor need, the same level of autonomous operation. At the base level there is little to no automation, requiring manual input and analysis. At intermediate levels tools augment human activity, removing manual steps and providing guidance or recommendations. At the highest level, fully automated workflows require a system to independently perform tasks such as data processing, anomaly detection, report generation, or even control actions based on its insights.

Data: Characterising the information foundations

6. Level of detail

The level of detail dictates the nature, volume, and diversity of data to be modelled, and the scale of the insights that can be derived. This dimension describes the granularity and richness of information required to fulfil a particular use-case.

  • Conceptual/Schematic (meta): representing high-level spatial relationships and overall arrangement and functionality e.g. simple bounding boxes, zones, blocks.
  • Simplified/Massing (macro): providing a generalised representation of objects and geometry, but lacking surface details or architectural features e.g. massing models, building footprints and overall form, road outlines and connections.
  • Detailed/Component (meso): aims for precise dimensions and features with embedded data for specific elements e.g. detailed building components, accurate utility asset locations.
  • Elemental/As-built (micro): reflects full architectural details, geometry, material properties, and even comprehensive manufacturing and operational data.

7. Spatial accuracy

The required level of spatial accuracy has implications on data capture and the nature of visual and analytic tooling. This dimension defines the geometric precision of the geographic information within the digital twin. The spectrum ranges from relative positioning within a model, where the spatial relationships between elements are accurate but not tied to absolute real-world coordinates, to absolute real-world coordinates, where all spatial data is accurately and precisely georeferenced.

8. Realism

The level of realism is driven by the need for intuitive understanding and presentation. This has direct implications for the type of data input and the digital outputs required. This dimension describes the visual fidelity of the digital representation of the physical asset or system. The spectrum ranges from abstract/schematic representations, focusing on key components and connections, through to photorealistic renderings indistinguishable from reality.

9. Latency

The latency required will vary hugely by use-case and has major implications for all aspects of a digital twin solution, from data ingestion to modelling, analytics pipelines, and user interfaces. This dimension defines the timeliness of data updates within the digital twin. The spectrum ranges from infrequent annual snapshots through to real-time streams of live conditions.

10. Standards

Data standards inform the connectivity and modelling capabilities of a digital twin. This includes support for specific formats and standard interfaces as well as integrations with other enterprise systems. Unlike the other dimensions this is not a spectrum with a clear continuum or linear progression, instead it is multi-valued. A given use-case can involve multiple data sources and formats that need to be handled.
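One way to put the framework to work is to capture each use case's position on the 10 Dimensions as structured data, so different use cases can be profiled and compared side by side. The sketch below is our own illustration, not a Sensat product schema; the field values and the example profile are hypothetical.

```python
# Minimal sketch: a use case profiled against the 10 Dimensions.
# Spectrum dimensions hold a single value; Audience and Standards
# are multi-valued, as noted in the framework. All values below are
# hypothetical illustrations.

from dataclasses import dataclass, asdict

@dataclass
class UseCaseProfile:
    spatial_extent: str       # singular asset .. national network
    life_cycle: str           # optioneering .. decommissioning
    audience: list[str]       # multi-valued
    participation: str        # individual .. cross-organisational
    automation: str           # manual .. fully automated
    level_of_detail: str      # meta / macro / meso / micro
    spatial_accuracy: str     # relative .. absolute georeferenced
    realism: str              # schematic .. photorealistic
    latency: str              # annual snapshot .. real-time
    standards: list[str]      # multi-valued

# Hypothetical profile for a daily site activity briefing.
daily_briefing = UseCaseProfile(
    spatial_extent="site",
    life_cycle="construction",
    audience=["site_managers", "subcontractors"],
    participation="cross_organisational",
    automation="augmented",
    level_of_detail="meso",
    spatial_accuracy="absolute_georeferenced",
    realism="simplified",
    latency="daily",
    standards=["IFC"],
)
```

Even a lightweight record like this forces the conversation the framework is designed to provoke: for each dimension, how close to reality does this particular use case actually need to be?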

Illustrative examples

Illustrative analysis of potential use cases based on the 10 Dimensions
Illustrative visualisation of use cases mapped against the 10 Dimensions

Final Thoughts

We’re in a time of immense financial, regulatory, and socio-environmental pressure to achieve more with less. Digital twins will play a key role in driving greater efficiencies, and they are already starting to deliver "auto-magical" capabilities (*cough cough*, Orion). Nevertheless, a strict focus on use-case-driven development remains essential to realise value incrementally.

Developing large-scale infrastructure is a complex business, involving thousands of people across multiple organisations making critical decisions on a daily basis. As well as mapping use-cases and devising technology solutions, we must also bring people along for the ride.

At Sensat, we believe software and data are vital components, but the human element remains central. We lead with an end-to-end approach, guiding use-case definition, implementing solutions, and facilitating digital change management. Ultimately, the true value of any digital innovation isn't in its technical sophistication, but in its capacity to deliver meaningful improvements to the way we design, build, operate, and interact with the world around us.

Want to talk about what this looks like in practice? Get in touch. We’d love to help out!