Neural Architecture Search for Inversion

Technical Presentations Group 3: Algorithms, Foundations, Visualizations, and Engineering Applications

Over the years, deep learning has been applied to inversion problems, for example to learn the relationship between recorded wavefields and velocity models (Yang et al., 2016). Here we extend that work in two directions. The first is deriving a more appropriate loss function: a pixel-to-pixel comparison may not be the best way to characterize image structure, and we elaborate on how to construct a cost function that captures high-level features to enhance model performance. The second is searching for a more suitable neural architecture, which can be viewed as a subproblem of hyperparameter optimization and, more broadly, of automatic machine learning (AutoML). Several well-known networks, such as U-Net, ResNet (He et al., 2016), and DenseNet (Huang et al., 2017), achieve phenomenal results on certain problems, yet it is hard to argue they are the best choice for inversion without a thorough search of the architecture space. Here we present our architecture search results for inversion.
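To make the first direction concrete, the sketch below (PyTorch, illustrative only, not the authors' actual loss) shows one common way to move beyond a pixel-to-pixel comparison: both the predicted and the true velocity model are passed through a frozen pretrained feature extractor, and the comparison is made on intermediate activations, with a small pixel-wise term retained for stability. The layer index and weighting are hypothetical choices.

```python
import torch
import torch.nn as nn
from torchvision import models


class FeatureLoss(nn.Module):
    """L2 distance between intermediate VGG features, plus a small pixel term."""

    def __init__(self, layer_index=16, pixel_weight=0.1):
        super().__init__()
        vgg = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
        # Keep only the convolutional layers up to `layer_index` and freeze them.
        self.features = nn.Sequential(*list(vgg.features[:layer_index])).eval()
        for p in self.features.parameters():
            p.requires_grad_(False)
        self.pixel_weight = pixel_weight

    def forward(self, pred, target):
        # Velocity models are single-channel; repeat to 3 channels for VGG.
        pred3 = pred.repeat(1, 3, 1, 1)
        target3 = target.repeat(1, 3, 1, 1)
        feat_loss = nn.functional.mse_loss(self.features(pred3),
                                           self.features(target3))
        pix_loss = nn.functional.mse_loss(pred, target)
        return feat_loss + self.pixel_weight * pix_loss
```

For the second direction, the following minimal sketch illustrates the simplest form an architecture search can take: random search over a small space of depth, channel width, and kernel size, keeping the configuration with the best validation score. The `evaluate` routine and the search space are hypothetical placeholders, not the setup used in the presented work.

```python
import random
import torch.nn as nn


def build_cnn(depth, channels, kernel_size, in_ch=1, out_ch=1):
    """Assemble a plain convolutional network from sampled hyperparameters."""
    layers, prev = [], in_ch
    for _ in range(depth):
        layers += [nn.Conv2d(prev, channels, kernel_size, padding=kernel_size // 2),
                   nn.ReLU(inplace=True)]
        prev = channels
    layers.append(nn.Conv2d(prev, out_ch, 1))
    return nn.Sequential(*layers)


SEARCH_SPACE = {"depth": [4, 6, 8], "channels": [16, 32, 64], "kernel_size": [3, 5]}


def random_search(evaluate, trials=20):
    """evaluate(model) -> validation loss; returns (best_loss, best_config, best_model)."""
    best = (float("inf"), None, None)
    for _ in range(trials):
        cfg = {k: random.choice(v) for k, v in SEARCH_SPACE.items()}
        model = build_cnn(**cfg)
        score = evaluate(model)
        if score < best[0]:
            best = (score, cfg, model)
    return best
```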

Authors: Xin Zhao (CGG), Licheng Zhang (University of Houston) and Cheng Zhan (Microsoft)

Tuesday October 26, 2021 1:15pm - 1:30pm CDT