Abstract:
Geomechanical–numerical modeling aims to describe the stress field within rock volumes, using stress magnitude data records for model calibration. However, the high cost of acquiring these records frequently results in sparse datasets. Moreover, in-situ stress measurements are conducted at the meter scale, which limits their representativeness for larger rock volumes. The question of how many stress magnitude data records are needed to avoid introducing additional uncertainty due to data sparseness has not yet been addressed. To assess how the number of data records affects prediction accuracy, we use a unique calibration dataset from site explorations for a deep geological repository in northern Switzerland. In the Zürich Nordost siting region, this dataset comprises 30 minimum horizontal stress (Shmin) measurements and 15 maximum horizontal stress (SHmax) estimates, collected across a succession of Mesozoic sediments with varying rock mass properties. The stiffness variability within and across rock formations controls the ranges of expected stresses. We develop a numerical framework that runs multiple model simulations with incrementally increasing numbers of calibration data records to assess their impact on predicted stress magnitudes. We identify the minimum number of data records needed to achieve a modeled stress range narrower than the stress range resulting from stiffness variability. Furthermore, our framework objectively identifies an outlier in the calibration dataset, linked to a local stiffness anomaly in the Opalinus Clay. Such an outlier significantly affects the modeled stress ranges and the model accuracy when only small numbers of calibration data records are used. This study offers valuable insights for subsurface projects, such as energy storage, CO2 sequestration, and tunneling, where stress predictions with quantified uncertainty are critical.
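The incremental-calibration idea described in the abstract can be illustrated with a minimal sketch. The snippet below is not the authors' framework: it stands in for the full geomechanical model with a simple linear stress-gradient fit, uses synthetic Shmin records, and all names and values (STIFFNESS_RANGE, TARGET_DEPTH, calibrate_and_predict, N_RESAMPLES) are hypothetical assumptions. It shows how one could draw random calibration subsets of increasing size and find the smallest size whose spread of predictions falls below a stiffness-derived range.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic calibration dataset (hypothetical): depth (m) vs. measured Shmin (MPa).
depths = np.linspace(400, 900, 30)
shmin = 0.018 * depths + rng.normal(0.0, 0.8, size=depths.size)

STIFFNESS_RANGE = 2.0  # assumed stress range (MPa) due to stiffness variability
TARGET_DEPTH = 800.0   # depth (m) at which predictions are compared
N_RESAMPLES = 500      # random subsets drawn per subset size


def calibrate_and_predict(d, s, target):
    """Stand-in for a full geomechanical model calibration:
    fit a linear stress gradient to the subset and predict at the target depth."""
    slope, intercept = np.polyfit(d, s, 1)
    return slope * target + intercept


def predicted_range(n):
    """Spread of predictions across random calibration subsets of size n."""
    preds = []
    for _ in range(N_RESAMPLES):
        idx = rng.choice(depths.size, size=n, replace=False)
        preds.append(calibrate_and_predict(depths[idx], shmin[idx], TARGET_DEPTH))
    return np.ptp(preds)  # max - min of predicted Shmin


# Smallest number of calibration records whose modeled stress range is
# narrower than the range attributable to stiffness variability.
for n in range(2, depths.size + 1):
    r = predicted_range(n)
    if r < STIFFNESS_RANGE:
        print(f"minimum records: {n} (predicted range {r:.2f} MPa)")
        break
```

Under the same assumptions, the outlier check mentioned in the abstract could follow a similar resampling pattern: leave out one record at a time and flag the record whose removal shifts the predicted stress range the most.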