In this paper, we formulate seismic full waveform inversion (FWI) within a deep learning environment. We are motivated both by the possibility of combining the training of multiple datasets with the relatively low dimensionality of a theory-guided network design, and by the fact that in doing so we obtain an FWI algorithm ready-made for new computational architectures. A recurrent neural network is set up with rules enforcing elastic wave propagation, with the wavefield projected onto a measurement surface acting as the labeled data to be compared with observed seismic data. Training this network amounts to carrying out elastic FWI (eFWI). Using automatic differentiation, gradients can be constructed accurately and efficiently by inspection and use of the computational graph, and these gradients act to update the elastic model. Under the theory-guided network design, automatic differentiation provides efficiency and flexibility with respect to changes of misfit and parameterization. We use different misfits, namely the l2, l1, and Huber norms, to improve the inversion results for the eFWI parameters. We also prepare our approach to mitigate cross-talk, a general issue in multiparameter full waveform inversion, by allowing relative freedom in the choice of eFWI parameterization.
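To make the idea concrete, the following is a minimal sketch, not the authors' implementation: a 1D scalar wave equation stands in for the elastic system, and the function names (`forward_model`, `misfit`), grid sizes, wavelet, and optimizer are illustrative assumptions. What it shows is the mechanism the abstract describes: the unrolled finite-difference time loop plays the role of the recurrent network, the receiver trace is compared with observed data under a selectable l2, l1, or Huber misfit, and the model gradient is obtained directly from the computational graph by automatic differentiation.

```python
# A minimal sketch, not the authors' implementation: a 1D scalar wave equation
# stands in for the elastic system; all names, sizes, and parameters here are
# illustrative assumptions. The unrolled time loop plays the role of the
# recurrent network; autograd supplies the model gradient from the graph.
import torch
import torch.nn.functional as F


def ricker(nt, dt, f0=25.0, t0=0.05):
    """Ricker wavelet used as the source time function."""
    t = torch.arange(nt) * dt - t0
    a = (torch.pi * f0 * t) ** 2
    return (1.0 - 2.0 * a) * torch.exp(-a)


def forward_model(vel, src, dt, dx, src_ix, rec_ix):
    """Unrolled second-order finite-difference time loop (one 'RNN cell' per step).

    vel : (nx,) velocity model, the trainable parameter
    src : (nt,) source wavelet
    Returns the (nt,) wavefield sampled at the receiver, i.e. the output
    compared with the observed trace.
    """
    u_prev = torch.zeros_like(vel)
    u_curr = torch.zeros_like(vel)
    inject = torch.zeros_like(vel)
    inject[src_ix] = 1.0                      # fixed source injection stencil
    c2 = (vel * dt / dx) ** 2
    trace = []
    for it in range(src.shape[0]):
        lap = F.pad(u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2], (1, 1))
        u_next = 2.0 * u_curr - u_prev + c2 * lap + inject * src[it] * dt ** 2
        trace.append(u_next[rec_ix])
        u_prev, u_curr = u_curr, u_next
    return torch.stack(trace)


def misfit(pred, obs, kind="l2"):
    """Selectable data misfit: l2, l1, or Huber, as compared in the abstract."""
    if kind == "l2":
        return F.mse_loss(pred, obs)
    if kind == "l1":
        return F.l1_loss(pred, obs)
    return F.huber_loss(pred, obs, delta=1.0)


if __name__ == "__main__":
    nx, nt, dx, dt = 100, 600, 10.0, 1e-3
    src_ix, rec_ix = 5, 90

    # Synthetic "observed" data from a two-layer true model.
    vel_true = torch.full((nx,), 2000.0)
    vel_true[50:] = 2500.0
    wavelet = ricker(nt, dt)
    observed = forward_model(vel_true, wavelet, dt, dx, src_ix, rec_ix).detach()

    # Homogeneous starting model; "training" the network updates it.
    vel = torch.full((nx,), 2000.0, requires_grad=True)
    optimizer = torch.optim.Adam([vel], lr=10.0)
    for iteration in range(50):
        optimizer.zero_grad()
        predicted = forward_model(vel, wavelet, dt, dx, src_ix, rec_ix)
        loss = misfit(predicted, observed, kind="huber")
        loss.backward()                       # gradient via the computational graph
        optimizer.step()
```

Switching the `kind` argument between the l2, l1, and Huber misfits, or reparameterizing the model (for instance slowness or impedance in place of velocity), requires no change to the gradient code; this is the kind of flexibility the abstract attributes to automatic differentiation under a theory-guided network design.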

Presentation Date: Tuesday, October 13, 2020

Session Start Time: 1:50 PM

Presentation Time: 3:05 PM

Location: 351F

Presentation Type: Oral
