Shale and tight reservoirs are characterized by complex geology, heterogeneous fracture networks, and transient flow regimes, which pose significant challenges for accurate and reliable production forecasting. Conventional methods, such as decline curve analysis, struggle to capture the physics of unconventional flow and to handle the uncertainty and variability of the data. Full-scale hydraulic fracture modeling and reservoir flow simulation, on the other hand, require extensive surveillance data for model calibration and long turnaround times for history matching, which makes it especially challenging for operators to scale these workflows to many wells at a cadence that matches the rapid pace of shale and tight asset development.
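To make the limitation of conventional decline curve analysis concrete, consider the classical Arps hyperbolic model. The sketch below is illustrative only (parameter values are assumptions, not taken from this study); note that transient flow in shale wells often yields fitted b-factors above 1, under which the Arps model implies unbounded cumulative recovery and must be constrained heuristically.

```python
import numpy as np

def arps_hyperbolic(t, qi, di, b):
    """Arps hyperbolic rate decline: q(t) = qi / (1 + b*di*t)^(1/b).

    qi: initial rate, di: initial decline rate (1/time), b: decline exponent.
    For b > 1 (common in transient shale flow), cumulative production
    diverges as t -> infinity, so the model alone cannot bound reserves.
    """
    return qi / (1.0 + b * di * t) ** (1.0 / b)

# Illustrative parameters: 1000 bbl/d initial rate, 15%/month decline, b = 1.2
t = np.linspace(0.0, 24.0, 25)  # months
q = arps_hyperbolic(t, qi=1000.0, di=0.15, b=1.2)
```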
In this paper, we present a workflow that combines probabilistic modeling with deep learning models trained on an ensemble of physics models to improve the scalability and reliability of shale and tight reservoir forecasting.
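The ensemble-of-physics-models idea can be sketched as sampling uncertain reservoir parameters and running each member through a generic simulator to build a training library. The parameter names and ranges below are purely illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # number of ensemble members

# Hypothetical uncertain inputs for a generic tight-oil model:
# matrix permeability (md, log-uniform), fracture half-length (ft),
# and initial reservoir pressure (psi).
ensemble = {
    "perm_md": 10.0 ** rng.uniform(-4, -2, n),
    "xf_ft": rng.uniform(100.0, 400.0, n),
    "p_init_psi": rng.uniform(4000.0, 7000.0, n),
}

# Each member would then be passed to the reservoir model to produce one
# synthetic production curve, forming the training set for the deep
# learning forecaster.
```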
We construct generic reservoir models that capture the key first principles of unconventional well production mechanisms. PVT and pressure machine learning models are developed and incorporated into the reservoir models so that these generic models can represent fluid regimes and multiphase fluid behavior across development areas and reservoir conditions where measured data are not directly available. These models generate synthetic production curves that serve as digital analogs and augment field data. We then develop deep learning models that produce probabilistic forecasts for new wells in the field.
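One common way to obtain probabilistic forecasts from a deep learning model is to train it under a quantile (pinball) loss so it emits, for example, P10/P50/P90 production curves. The paper does not specify its loss function; the snippet below is a minimal sketch of this standard technique:

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    """Quantile (pinball) loss for quantile level tau in (0, 1).

    Penalizes under-prediction by tau and over-prediction by (1 - tau),
    so minimizing it drives y_pred toward the tau-quantile of y_true.
    """
    err = y_true - y_pred
    return np.mean(np.maximum(tau * err, (tau - 1.0) * err))

# At tau = 0.5 the pinball loss reduces to half the mean absolute error,
# recovering a median (P50) forecast.
```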
We apply our approach to synthetic cases as well as to many wells in the Permian Basin. Through hindcast studies, we demonstrate that these models can generate realistic and diverse production curves, capture the physics of unconventional flow, quantify uncertainty in well production outlooks, and support the interpretation of subsurface uncertainty.
In unconventional asset development, scalable forecasting is a key component of forecast reliability. While a detailed study of a single well with abundant surveillance data helps explain the production mechanism, consistent and informative forecasts for the many wells that lack high-fidelity data provide a regional view of the asset and are therefore equally important. However, scalability remains hindered by data availability constraints, primarily the high cost of data acquisition. While physics models have long been employed to study well performance, they typically require high-fidelity subsurface input data, and their application has predominantly been limited to individual well studies.