Abstract
Several investigators have reported that coal permeability decreases with increasing stress, but no conceptual model has been advanced to explain this effect. To better understand the permeability of stressed coal, a theoretical and experimental program was undertaken. A common naturally fractured reservoir geometry, a collection of matchsticks, was extended to stressed coalbeds and tested against laboratory measurements on samples from the San Juan and Warrior Basins. Good agreement was obtained between theoretical behavior and laboratory data. Equations are presented for converting laboratory-measured stress-permeability data to (a) in-situ permeability as a function of burial depth in a basin and (b) reservoir permeability during coalbed depletion.
Coal cleat compressibility, analogous to the pore volume compressibility of conventional reservoirs, has historically been difficult and expensive to measure, and the results of such measurements are often ambiguous. A method is presented for calculating cleat volume compressibility from stress-permeability experiments, resulting in considerable savings of both time and money. The stress-permeability and cleat volume compressibility results reported here are compared with those published in the literature.
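The matchstick geometry implies an exponential decline of permeability with effective stress, k = k0 exp(-3 cf (sigma - sigma0)), so cleat volume compressibility cf can be recovered from the slope of ln(k) versus stress. The sketch below illustrates that idea; the exponential form and the data points are illustrative assumptions, not values from the study.

```python
import math

# Matchstick geometry implies k = k0 * exp(-3 * cf * (sigma - sigma0)),
# so cf can be read from the slope of ln(k) versus effective stress.
# The data points below are purely illustrative, not measured values.

def cleat_compressibility(stresses_psi, perms_md):
    """Least-squares slope of ln(k) vs stress, divided by -3 (units: 1/psi)."""
    n = len(stresses_psi)
    lnk = [math.log(k) for k in perms_md]
    mean_s = sum(stresses_psi) / n
    mean_lnk = sum(lnk) / n
    num = sum((s - mean_s) * (y - mean_lnk) for s, y in zip(stresses_psi, lnk))
    den = sum((s - mean_s) ** 2 for s in stresses_psi)
    slope = num / den           # d ln(k) / d sigma, negative for closing cleats
    return -slope / 3.0

# Hypothetical stress-permeability measurements (psi, md):
sigma = [500.0, 1000.0, 1500.0, 2000.0]
k = [10.0, 5.5, 3.0, 1.65]
cf = cleat_compressibility(sigma, k)
```

Because the method needs only a permeability-versus-stress dataset, it replaces a separate cleat compressibility measurement, which is the source of the time and cost savings claimed above.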
Evidence in the literature indicates that the coal matrix shrinks when gas is desorbed, increasing cleat permeability. Assuming a matchstick geometry and using a coal matrix shrinkage coefficient reported in the literature, the increase in cleat permeability due to matrix shrinkage during depletion was calculated and compared with the decrease in permeability due to increased stress.
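The competition described above can be sketched numerically: under matchstick geometry, permeability scales with the cube of cleat porosity, so matrix shrinkage widens cleats while rising effective stress closes them. Every coefficient below (compressibility, initial cleat porosity, shrinkage per unit drawdown, and the one-for-one stress response) is a hypothetical placeholder, not a value from the study.

```python
import math

# Competing effects on cleat permeability during depletion, under a
# matchstick geometry where k scales with the cube of cleat porosity.
# All coefficients are hypothetical placeholders for illustration only.

CF = 4.0e-4        # cleat volume compressibility, 1/psi (assumed)
PHI0 = 0.01        # initial cleat porosity, fraction (assumed)
SHRINK = 5.0e-7    # matrix shrinkage strain per psi of drawdown (assumed)

def k_ratio(drawdown_psi):
    """k/k0 after a given pore-pressure drawdown.

    Stress term: effective stress is assumed to rise one-for-one with
    drawdown, closing cleats as exp(-3 * CF * dP).
    Shrinkage term: matrix strain widens cleats; for matchsticks the
    resulting porosity ratio is cubed.
    """
    stress_term = math.exp(-3.0 * CF * drawdown_psi)
    shrink_term = (1.0 + (2.0 / PHI0) * SHRINK * drawdown_psi) ** 3
    return stress_term * shrink_term

# Net effect at a few hypothetical drawdowns (psi):
ratios = {dp: k_ratio(dp) for dp in (250.0, 500.0, 1000.0)}
```

With these placeholder values the stress effect dominates and net permeability declines; whichever effect wins in practice depends on the actual compressibility and shrinkage coefficients of the coal.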