Reservoir Simulation Software: Profiting Through Performance

Posted in: ECHELON Software

In recent years, the way in which oil and gas companies value and use data in their business has evolved rapidly. Some of this has been driven by the 2015 downturn and the unrelenting pressure on energy companies to become more efficient and lower costs. These efforts will become even more critical as the industry climbs out of the current slump caused by the COVID-19 pandemic. Historically cautious energy companies are now aggressively pursuing new technologies that can accelerate this process. This holds especially true for the way companies generate, process and use data to make more rapid and statistically sound decisions that affect their business. Illustrative examples are the emergence of new technologies such as machine learning and the changing ways in which companies use existing tools such as reservoir simulation software.

A key goal in reservoir simulation is reducing the uncertainty in forecasting. Uncertainty is introduced to reservoir modelling primarily by the incomplete or imprecise knowledge obtained from subsurface measurements. The seismic and well data used to create reservoir models is by nature sparse and overlaid with many assumptions and approximations that guide its filtering and analysis. It is important to properly represent the uncertainty in the reservoir modelling process so that decision-makers understand the risk associated with each decision.

The traditional approach to dynamic reservoir modelling relies on a single model or a small number of scenarios representing high, medium, and low probability cases. These models stand in as the ‘best guess’ of the features of the reservoir and are used to make production and investment decisions for the asset. By using such a small number of models to represent the reservoir, engineers are thinly sampling the space of possible outcomes. The bottleneck to using a larger sample has historically been the limitations of the reservoir simulation tools available in the industry; there was simply neither the time nor the resources to carry out a complete survey of model uncertainty. This has begun to change in the last decade for two essential reasons.

High Performance Computing

The first is the evolution of the high performance computing (HPC) industry and the emergence of faster, cheaper hardware. As increases in the clock speed of central processing units (CPUs) began to level off in the mid-2000s, the HPC market shifted to multi-core development, placing multiple cores on a single processor socket. This delivered a dramatic performance increase for workloads that could be executed across several cores simultaneously, and performance continued to climb as more cores were added with each new generation of processors. As the market moved to multi-core designs, another key technology, the graphics processing unit (GPU), emerged in the HPC industry. GPUs contain thousands of small, efficient cores that work simultaneously. Traditionally used for fast 3D game rendering, they began to be harnessed more broadly to accelerate computational workloads. Not all applications could take advantage of this new hardware, but those that could showed remarkable speedups. Two of the industry's top commercial supercomputers, Eni's HPC5 and Total's Pangea III, are massive GPU-based clusters.

The emergence of cloud services has also shaped the HPC market by making modern hardware more accessible. This is especially true for small to mid-sized oil and gas companies that are constrained by the large upfront cost of traditional on-premise HPC systems. By using the cloud, these companies gain access to the latest hardware generations at an entry price far lower than on-premise systems. Reservoir simulation is a natural fit for the cloud because of its cyclical usage: the duty cycle within most companies shifts dramatically up and down as projects and deadlines come and go. Paying for systems only when they are in use is appealing to many companies, especially when reservoir simulation usage changes from month to month. The drawbacks of running applications such as reservoir simulation in the cloud centre on security concerns and on the economics of high-usage cases. Even so, there is no denying that cloud technology has had a profound impact on the HPC community.

Australian oil and gas producer Woodside, for example, now runs all of its HPC exclusively in the cloud. It has found that the burst-like nature of reservoir simulation is well matched to the dynamics of the cloud: one day may require no simulations at all, while the next may demand tens of thousands of concurrent models. Costs are tied more directly to the duration and resources consumed by each simulation than with on-premise options; however, the speed at which parallel execution turns inputs into results accelerates decisions, and the value of faster decisions is much higher than the incurred cost of immediately scalable simulation. Studies are no longer limited by simulation size or scale, but by the ability to digest the generated data and infer meaningful insights given the uncertainty; the bottleneck has moved. According to Hyperion Research, spending on HPC work in the cloud is expected to grow from US$2.5 billion in 2018 to US$7.4 billion in 2023.
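As a rough, back-of-the-envelope illustration of why this burst pattern suits pay-per-use pricing, the sketch below compares time-to-result for a fixed on-premise cluster with an elastic cloud burst. Every number in it is a hypothetical placeholder, not a quote of any provider's pricing or any simulator's run time.

```python
"""
Back-of-envelope comparison of time-to-result for a bursty ensemble study
run on a fixed on-premise cluster versus concurrently in the cloud.
All figures below are hypothetical placeholders.
"""

N_MODELS = 1000        # realizations in the study (hypothetical)
HOURS_PER_RUN = 2.0    # wall-clock hours per simulation on one node (hypothetical)
NODE_HOUR_COST = 3.0   # USD per cloud node-hour (hypothetical)
ON_PREM_NODES = 8      # fixed capacity of an on-premise cluster (hypothetical)

# On-premise: throughput is capped by the fixed number of nodes,
# so the study queues up and elapsed time grows with ensemble size.
on_prem_hours = N_MODELS / ON_PREM_NODES * HOURS_PER_RUN

# Cloud burst: provision one node per realization, so the elapsed time
# is roughly that of a single run, paid for by the node-hour.
cloud_hours = HOURS_PER_RUN
cloud_cost = N_MODELS * HOURS_PER_RUN * NODE_HOUR_COST

print(f"On-premise elapsed time : {on_prem_hours:7.1f} h")
print(f"Cloud-burst elapsed time: {cloud_hours:7.1f} h")
print(f"Cloud compute cost      : ${cloud_cost:,.0f}")
```

Under these assumed numbers the cloud burst finishes in hours rather than weeks; whether the extra compute spend is worthwhile then depends on how much a faster decision is worth, which is precisely the trade-off described above.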

Parallel Reservoir Simulators

The second reason for the improved capability of reservoir simulation software is the emergence of new parallel simulators that can exploit the advantages offered by modern hardware. Although processes such as seismic imaging are naturally amenable to massive parallelism, it is more challenging to expose such parallelism in reservoir simulation, where each time step requires solving a large, tightly coupled system of equations spanning the whole grid rather than many independent tasks. It therefore took more time and effort to create parallel reservoir simulators, which is evident in how companies have used their HPC systems over the years. Historically, companies have dedicated the majority of their computing resources to seismic imaging, with reservoir simulation a distant second. This is gradually changing as energy companies adopt more parallel simulators and move to more probabilistic methods of reservoir modelling.

The increase in computing power is changing how companies view the use of reservoir simulation software. Instead of using one or a few models to represent the reservoir, they are moving towards more statistical methods, such as ensemble modelling. Ensemble modelling is a technique in which thousands of different realizations of a model are simulated to provide an envelope of possible outcomes with probabilistic weighting. It recognizes and embraces uncertainty, and provides statistical bounds on future production. This enables companies to better understand the uncertainty associated with the reservoir and avoid ad-hoc assumptions during the decision-making process. It also feeds the machine learning and artificial intelligence methods used by oil companies by generating the large data sets those methods require. Methods like ensemble modelling and uncertainty quantification require heavy computing power, which has historically limited their use in traditional reservoir simulation. That burden has now been lifted: companies such as Eni, with its GPU-based HPC5 supercomputer, now choose the best development scenario by building ensembles of models and running hundreds to thousands of simulations.
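To make the idea concrete, here is a minimal ensemble-modelling sketch in Python. It uses a toy exponential decline curve as a stand-in for a full reservoir simulator, samples uncertain parameters for each realization, and extracts a probabilistic envelope of future production. The model form, parameter ranges, and ensemble size are illustrative assumptions only, not anyone's actual workflow.

```python
"""
Minimal ensemble-modelling sketch: many realizations of a toy exponential
decline-curve forecast stand in for full reservoir simulations, and the
resulting percentile envelope bounds future production probabilistically.
"""
import numpy as np

rng = np.random.default_rng(seed=42)

N_REALIZATIONS = 1000                  # size of the ensemble (illustrative)
t = np.linspace(0.0, 10.0, 121)        # forecast horizon in years

# Sample uncertain model parameters for each realization (assumed ranges).
q0 = rng.normal(1000.0, 150.0, N_REALIZATIONS)      # initial rate, bbl/day
decline = rng.uniform(0.15, 0.45, N_REALIZATIONS)   # nominal decline, 1/year

# "Simulate" every realization: rate(t) = q0 * exp(-decline * t).
rates = q0[:, None] * np.exp(-decline[:, None] * t[None, :])

# Probabilistic envelope across the ensemble at each forecast time:
# 10th, 50th and 90th percentiles of rate.
low, mid, high = np.percentile(rates, [10, 50, 90], axis=0)

print(f"Rate after 5 years: low={np.interp(5, t, low):.0f}, "
      f"median={np.interp(5, t, mid):.0f}, "
      f"high={np.interp(5, t, high):.0f} bbl/day")
```

The envelope, rather than any single curve, is what supports a statistically grounded decision; the computational price is that every member of the ensemble must be simulated.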

Companies are also moving towards larger, finer-grained models. Traditional reservoir simulation relies on upscaling, a process in which detail is removed from large geological models to create smaller simulation models that are faster and more manageable. Modern parallel reservoir simulators, such as the GPU-based ECHELON software, enable companies to dispense with upscaling and instead simulate the model at its full geologic resolution. Geologic complexity is a key factor controlling long-term recovery, and preserving the detail developed in modern geologic modelling tools can be critical for understanding and optimizing it. Reservoir simulators such as ECHELON allow companies to model these large, complex systems at speeds that make the practical simulation of hundreds or thousands of ensemble realizations feasible. More detailed, higher-resolution models give engineers and managers additional critical subsurface information that informs decision-making. This is true even for very small companies, such as the Denver-based consultancy iReservoir, where models of several million or more active cells have become routine with ECHELON software on small workstations. In another example, Houston-based Marathon Oil Company uses ECHELON software to run models with tens of millions of cells in full-field simulations that include multiple wells with complex fracture geometry.
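For readers unfamiliar with upscaling, the sketch below shows one of its simplest forms: coarsening a synthetic fine-scale permeability field with arithmetic and harmonic block averages. The field, block size, and averaging choices are illustrative assumptions; production workflows use far more sophisticated, flow-based upscaling. The point is only that the coarse grid loses the extremes and heterogeneity that the fine grid carries.

```python
"""
Minimal sketch of grid upscaling: a synthetic fine-scale permeability field
is coarsened by simple arithmetic and harmonic averaging over 4x4 blocks.
"""
import numpy as np

rng = np.random.default_rng(seed=0)

# Fine-scale log-normal permeability field (millidarcy), 128 x 128 cells.
fine = np.exp(rng.normal(np.log(100.0), 1.0, size=(128, 128)))

BLOCK = 4  # each coarse cell replaces a BLOCK x BLOCK patch of fine cells

# Group the fine cells so each coarse cell sees its patch on the last two axes.
patches = (fine.reshape(128 // BLOCK, BLOCK, 128 // BLOCK, BLOCK)
               .transpose(0, 2, 1, 3))

coarse_arith = patches.mean(axis=(2, 3))              # arithmetic block average
coarse_harm = 1.0 / (1.0 / patches).mean(axis=(2, 3))  # harmonic block average

print(f"fine grid:   {fine.size} cells, "
      f"min={fine.min():.1f} md, max={fine.max():.1f} md")
print(f"coarse grid: {coarse_arith.size} cells, "
      f"min={coarse_arith.min():.1f} md, max={coarse_arith.max():.1f} md")
```

Running it shows the coarse grid spanning a much narrower permeability range than the fine grid; the high- and low-permeability streaks that often control long-term recovery are exactly the detail that averaging smooths away, which is why simulating at full geologic resolution matters.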

The performance of modern parallel reservoir simulators has also led to increased use of more complex physics, such as compositional modelling. Compositional modelling allows engineers to track how the chemical composition of the hydrocarbon changes throughout the production process. This is important in cases, such as CO2 flooding, where the changing composition of the hydrocarbon mix can dramatically affect recovery. This type of modelling is very compute-intensive and thus requires much longer run times than simpler simulations. Because of this, engineers have historically avoided compositional modelling where possible by making simplifying assumptions and limiting model complexity, which adds to the uncertainty in the model and negatively affects business decisions. Fast parallel simulators shrink that run-time penalty, making rigorous compositional models practical where simplified approximations were once the only option.
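To give a sense of where the extra cost comes from, the sketch below implements one small building block of a compositional model: a Rachford-Rice flash that splits a mixture into vapour and liquid phases for a given set of equilibrium K-values. A compositional simulator performs a calculation of this kind, together with equation-of-state updates, for every cell at every time step. The three-component feed and K-values used here are assumed purely for illustration.

```python
"""
Sketch of a Rachford-Rice flash, one building block of compositional
modelling: split a hydrocarbon mixture into vapour and liquid phases
at fixed, assumed equilibrium ratios (K-values).
"""
import numpy as np
from scipy.optimize import brentq

z = np.array([0.60, 0.25, 0.15])   # overall mole fractions (e.g. light, mid, heavy)
K = np.array([3.50, 1.20, 0.20])   # equilibrium ratios y_i / x_i (assumed)

def rachford_rice(V):
    """Residual of the Rachford-Rice equation for vapour fraction V."""
    return np.sum(z * (K - 1.0) / (1.0 + V * (K - 1.0)))

# For this two-phase feed the residual changes sign on (0, 1),
# so the vapour fraction can be bracketed and solved there.
V = brentq(rachford_rice, 1e-9, 1.0 - 1e-9)

x = z / (1.0 + V * (K - 1.0))      # liquid-phase composition
y = K * x                          # vapour-phase composition

print(f"vapour fraction V = {V:.3f}")
print("liquid x =", np.round(x, 3), " sum =", round(float(x.sum()), 3))
print("vapour y =", np.round(y, 3), " sum =", round(float(y.sum()), 3))
```

Even this stripped-down flash requires an iterative root solve; repeating it, with full equation-of-state updates, across tens of millions of cells and thousands of time steps is what makes compositional runs so much more expensive than black-oil ones, and why fast parallel hardware makes them practical.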

Conclusion

Forces on both the demand and the supply side have reshaped the role of reservoir simulation software in the energy industry. On the demand side, there is a growing emphasis on ensemble methods, larger models and more complex physics, all of which drive the need for fast, scalable simulation of the type offered by reservoir simulators like ECHELON. On the supply side, multi-core CPUs and GPUs have emerged as mature, foundational platforms for scientific computing. Together, these technologies generate critical information for better decision-making and cost savings in an industry where even tiny improvements in efficiency or production can provide huge rewards.


Author
Brad Tolbert

Brad Tolbert heads up the Sales and Marketing team at Stone Ridge Technology.
