I gave this presentation at the Department of Geology and Geophysics TGIF Seminar series shortly after arriving at the University of Hawaii for my postdoc to work on GMT. The last interesting results that I'd had were from my PhD thesis so I thought I'd present that, though heavily edited in the interest of time (my thesis presentation was about 1h 20min).
I gave a few live demos of Fatiando a Terra using Jupyter notebooks during the talk:
The inner density distribution of the Earth can be inferred from disturbances in its gravitational field. However, accomplishing this is never easy. There are many possible parameterizations for the mathematical model, which is often non-linear. To make matters worse, gravity data alone do not contain enough information to obtain a unique and stable solution. One must add independent information to constrain the solution space, often in the form of regularization. Many different methods for performing this inference have been developed and research in this field is still active. Investigating new methodologies implies developing complex software, which often must be able to deal with sparse matrices and parallelism. I’ll present the open-source Python library Fatiando a Terra. It implements many of the components required for developing inversion methods, such as forward modeling, data processing and I/O, and regularization. I’ll also show how I used this library to develop a computationally efficient method for estimating the Moho depth from gravity data using a spherical approximation of the Earth.
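As a generic illustration of what regularization buys you (a toy example of my own, not Fatiando code): zeroth-order Tikhonov damping turns an underdetermined linear problem, whose normal equations are singular, into one with a unique and stable solution.

```python
import numpy as np

# Toy underdetermined problem: 5 data, 10 parameters, so d = G @ m has
# infinitely many solutions and G.T @ G is singular.
rng = np.random.default_rng(0)
G = rng.normal(size=(5, 10))
m_true = np.ones(10)
d = G @ m_true

# Zeroth-order Tikhonov (damping): minimize ||d - G m||^2 + mu ||m||^2.
# The regularized normal equations are always invertible.
mu = 0.1
m_est = np.linalg.solve(G.T @ G + mu * np.eye(10), G.T @ d)
```

The regularization parameter `mu` trades off data fit against solution stability, which is exactly the kind of choice the talk discusses.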
Comments? Leave one below or let me know on Twitter @leouieda or in the Software Underground Slack group.
Found a typo/mistake? Send a fix through Github and I'll happily merge it (plus you'll feel great because you helped someone). All you need is an account and 5 minutes!
Back in July of 2016, I applied for a postdoc position to build Python bindings for the Generic Mapping Tools (GMT) software with Professor Paul Wessel at the University of Hawaii. The short version is that I got the position, asked for a 1 year leave from UERJ (where I currently work as a professor), came to Hawaii, and right now I'm starting to get familiar with the GMT code base.
Read on for the long version.
The Pacific Ocean Science and Technology (POST) building on the UH Manoa campus. My office is on the top floor with a nice view of downtown Honolulu.
After two and a half years as a professor at UERJ, struggling to teach classes without any experience or time to prepare, all while finishing my PhD, I was feeling a bit burned out and eager to do something different. Meanwhile, Brazil was (and still is) in a huge economic and political crisis and Rio was hit pretty hard. The university, which is state-owned, was on strike from March 2016 until after the Olympic Games in August. The time was ripe to apply for something different abroad.
I signed up for a couple of mailing lists where people post job opportunities just to see what was out there. If you're looking for a new position or just want to get a feel for what is currently in demand, I highly recommend signing up for the Computational Infrastructure for Geodynamics (CIG) list and the Earth Sciences Job Email list (ES-JOBS). I get at least 5-10 emails from the ES-JOBS list per day (you might want to redirect them to a folder to keep your inbox from exploding). After about a month on the lists, this message from Paul came through the CIG list:
A full-time postdoctoral position is available in the Department of Geology and Geophysics at the University of Hawaii at Mānoa to participate in funded research in support of the expansion of the Generic Mapping Tools (GMT) to Python (possibly via Cython), with applications in plate tectonics and geodynamics. A one-year initial appointment is anticipated, with the possibility of a second year extension, depending on progress and availability of funds.
The successful applicant will be a highly motivated, independent researcher with extensive programming experience (preferably in C) and Python scripting and will assist Dr. Wessel and the GMT team in developing the GMT/Python API. Applicants must have completed a PhD in the physical sciences at the time of appointment, with a preference for geophysics, and should be proficient in spoken and written English. The position is open immediately and will remain open until an appointment is made.
To apply, please send a curriculum vitae, a brief (1 page) statement of research objectives, a brief (1 page) statement of skills or experience suitable for contributing to GMT development, and the names of three references to Dr. Paul Wessel. Questions should also be addressed to Dr. Wessel directly via e-mail. Information on the Department can be found at http://www.soest.hawaii.edu/GG. The University is an Equal Opportunity/Affirmative Action Institution.
The requirements seemed to fit me perfectly. My Bachelor's thesis was to develop a C program and I spent most of my PhD building a Python library. After some careful consideration with my wife and a bit of hesitation, I decided to apply for the position. I consulted my department and they generously agreed to cover my geophysics courses (1 and 2) during the year that I would be away. So I sent in my CV, the statement of research objectives, and the statement of skills. As always, the LaTeX sources for all three are in a Github repository if you want to have a look or need a template to get started.
I did a Skype interview with the very friendly GMT team and after a while I got an email saying that the position was mine! I finished my responsibilities for 2016 and took some vacation time to sort out the trip. In the middle of February I hopped on a quick ~24h trip from São Paulo to Honolulu and now here I am.
My new desk at UH with a great view of the tall buildings of Waikiki and downtown Honolulu on a nice rainy day.
The goal of the project is to build a bridge between GMT (a set of command-line programs written in C) and the Python programming language. GMT is arguably the best map-plotting software around and it certainly makes the most beautiful maps. This bridge will bring that power to the Python community.
The approach I'm currently exploring is to hook into the GMT C API (the internal functions that GMT exposes through a shared library) using the ctypes package from the Python standard library. This way, a user could call the internal GMT functions from a Python program or, even better, from a Jupyter notebook. This will serve as a basis for building a more Pythonic, higher-level library for GMT. Hopefully, this will help smooth the rough edges of the GMT command-line interface that cut newbies and gurus alike. After all, who could possibly remember what -DjTR+o0.3i/0.1i+w4i/0.2i+h is supposed to do?
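In a nutshell, the ctypes mechanism looks like this (a sketch using the C math library as a stand-in, since the GMT wrapper itself is still being worked out — this illustrates the technique, not the actual GMT API):

```python
import ctypes
import ctypes.util

# Load a shared library and describe a function's signature so that Python
# can call it safely. libm stands in for libgmt here.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

print(libm.cos(0.0))  # -> 1.0
```

Wrapping libgmt would follow the same pattern: load the shared library, declare the argument and return types of the API functions, and call them as if they were Python functions.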
Right now, I'm still struggling with CMake configuration issues, linking problems, incompatibilities with Anaconda, and the other pleasant things that come with dealing with compiled code. But I'll write more about that in a later post. Now, where did I put that libgdal.so file?
I get asked a lot on the Fatiando a Terra mailing list how to do some basic Python and numpy tasks that are not necessarily related to Fatiando. The most common question is some variant of "I have some data in a csv/txt/xyz file and I want to load it into Python". I think this happens because a lot of people find the project while searching for a replacement for GUI-based commercial products, like Geosoft's Oasis Montaj, but they don't necessarily know Python. So instead of writing yet another email, I decided to "reply in public" here.
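For the impatient, the answer to that most common question can be as simple as this (a minimal sketch; the column layout and values are made up, and `io.StringIO` stands in for a real file name like `"gravity.xyz"`):

```python
import io
import numpy as np

# numpy.loadtxt accepts a file name or any file-like object and skips
# lines starting with '#' by default.
data_file = io.StringIO("""\
# lon  lat  gravity_mgal
-45.0  -23.5  978123.4
-45.1  -23.6  978120.1
""")
lon, lat, gravity = np.loadtxt(data_file, unpack=True)
print(gravity.mean())  # average of the loaded gravity values
```

`unpack=True` transposes the result so that each column comes out as its own array, which is usually what you want for x, y, z data.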
Here are my recommendations (in order):
- Learn the basics of numpy, starting with loading data from text files (e.g., numpy.loadtxt).

From now on, learning new things will be a continuous process. I've been programming in Python for 10 years and every once in a while I still learn something new, usually something that reduces the amount of code I have to write (less code = less bugs). The key is to stay informed, and you can do that by subscribing to some (or all) of the following:
Now go out there and learn a skill that just might save you in these times of crisis!
How did you get started with Python? Do you have anything to add to this list? Let me know!
The open science logo is by G.emmerich on Wikimedia Commons and the picture of the Python book is by Marcus Brown. Because both are licensed CC-BY-SA, then so is the thumbnail image for this blog post (a composite of the two images).
This is part of The Leading Edge tutorials series. All tutorials are open-access and include open-source code examples.
The manuscript was written in Authorea. You can view and comment on the text online at Authorea and even edit it on the SEG Wiki. The final (pretty) PDF version is free to download from the publisher website (follow the doi link).
The Jupyter notebook that accompanies the tutorial (see the source code repository on Github) contains the full source code, along with documentation and tests. Both figures of the tutorial are produced by the code in the notebook.
The code and idea for this tutorial came from my Geofísica 2: Sismologia e sísmica course. I came across the problem of implementing NMO correction while preparing my lecture and practical exercises on this topic. This is a clear example of how learning happens both ways in a classroom.
Open any textbook about seismic data processing and you will inevitably find a section about the normal moveout (NMO) correction. When applied to a common midpoint (CMP) section, the correction is supposed to turn the hyperbola associated with a reflection into a straight horizontal line. What most textbooks won't tell you is how, exactly, to apply this correction to the data.
That is what this tutorial will teach you (hopefully).
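The gist, for the impatient, is this (a bare-bones sketch of my own, assuming a constant velocity and linear interpolation; the tutorial builds this up properly with a velocity that varies with t0 and cubic splines):

```python
import numpy as np

def nmo_correction(cmp, dt, offsets, velocity):
    """
    Sketch of NMO: for each zero-offset time t0 and offset x, compute the
    reflection time t = sqrt(t0^2 + x^2/v^2), read the amplitude at t by
    interpolating the trace, and place that amplitude at t0.
    """
    nsamples, ntraces = cmp.shape
    times = np.arange(nsamples) * dt
    corrected = np.zeros_like(cmp)
    for j, x in enumerate(offsets):
        reflection_times = np.sqrt(times**2 + (x / velocity)**2)
        # Amplitudes past the end of the trace are muted (set to zero).
        corrected[:, j] = np.interp(reflection_times, times, cmp[:, j],
                                    left=0, right=0)
    return corrected
```

On a synthetic CMP with a single hyperbolic reflection, this moves the event on every trace back to its zero-offset time, flattening the hyperbola.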
Uieda, L. (2017), Step-by-step NMO correction, The Leading Edge, 36(2), 179-180, doi:10.1190/tle36020179.1
Install Termux from Google Play, open it, and run:
$ apt install clang python python-dev fftw libzmq libzmq-dev freetype freetype-dev libpng libpng-dev pkg-config
$ LDFLAGS=" -lm -lcompiler_rt" pip install numpy matplotlib pandas jupyter
$ jupyter notebook
Copy the URL printed to the screen (it will look something like http://localhost:8888/?token=longstringofcharacters) and paste it into Chrome/Firefox. Enjoy!
Read on for more tips and a few tweaks.
UPDATE (25-01-2017): There were a few dependencies that I had left out of the instructions for installing numpy et al. I edited the post to make things more complete and clear.
I bought my first tablet last October, an NVIDIA Shield K1. I had been putting off getting one because I never could think of a good use for them. I have my phone for messaging and Internet, my kindle for reading, and my Linux laptop for working. It seemed to me that the tablet would be a nice toy but not something I would use enough to justify the purchase.
The dream would be to be able to ditch my laptop and do actual work on the tablet. Mark O'Connor wrote about doing just that on Yield Thought but he cheats a bit by running everything on a Linode server. And how does anyone do scientific programming these days without a Jupyter notebook?
I finally gave in, thinking that I would use the tablet mainly for reading papers and taking notes. Maybe even play a few games. Then I discovered Termux.
Here is a screencast of me running a Jupyter notebook server on my tablet. Notice that the URL is localhost:8888/, so this is not a remote server.
Install Termux from Google Play and open it. You'll be dropped into a bash terminal, like the one below.
Termux uses the apt package manager, so you can install packages pretty much like you would on Debian/Ubuntu.
The first thing I do on any new computer is install git so that I can fetch my configuration files from Github:
$ apt install git
Before cloning the repository, I need to generate a new SSH key (only required if you use the SSH protocol with git):
$ apt install openssh
$ ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
$ cat .ssh/id_rsa.pub  # copy and paste your public key to Github
Then I can clone my dotfiles repository:
$ git clone git@github.com:leouieda/dotfiles.git
Now my Termux terminal looks just like my Linux terminal on my laptops.
If you're from the pre-Anaconda era, you'll probably remember the frustration of trying to pip install numpy scipy matplotlib. Sadly, there is no Anaconda for Termux, so we're stuck with using the system Python and pip to install packages.
But don't despair! Things work more smoothly these days (if you follow the magic incantations). Sadly, the scipy library itself still can't be installed without significant effort, and even then you might not manage it because of all the Fortran requirements (BLAS, LAPACK, and gfortran). So for now, we have to make do with numpy only.
First, we must install Python itself (version 3.6), the header files, a C compiler, and the FFTW package from Termux:
$ apt install python python-dev clang fftw
Now we can install numpy using pip:
$ LDFLAGS=" -lm -lcompiler_rt" pip install numpy
For matplotlib, we'll need to install a few more dependencies:
$ apt install freetype freetype-dev libpng libpng-dev pkg-config
$ LDFLAGS=" -lm -lcompiler_rt" pip install matplotlib
And for Jupyter we need to install the zmq library as well:
$ apt install libzmq libzmq-dev
$ LDFLAGS=" -lm -lcompiler_rt" pip install jupyter
Finally, we can get pandas:
$ LDFLAGS=" -lm -lcompiler_rt" pip install pandas
Now you have access to things like ipython on the command line:
One thing that won't work is matplotlib plot windows, because there is no graphical backend for Android. You can, however, use %matplotlib inline or %matplotlib notebook inside Jupyter notebooks to get the plots working. Using plt.savefig without calling plt.show() should also work.
To get a Jupyter notebook server running, do the same thing you would on any other computer:
$ jupyter notebook
The server won't automatically open a browser but you can copy the URL from the output and paste it into Chrome or Firefox.
While it is possible to do some work using this setup (I wrote part of this post on the tablet using Vim and pushing to the website's Github repo), it may not be the most productive environment. Here are a few tips for making life a little bit easier.
This setup works and is way beyond what I expected to be able to accomplish with a $200 tablet. However, going back to pip installing numpy feels a bit like being back in 2010. What I've missed is Anaconda and the conda package manager. Having a prebuilt bundle certainly makes life a lot easier. But what I miss the most are conda environments. I use them extensively for my projects and papers.
The scipy package is still missing as well. A lot of things can be done using numpy replacements (numpy.fft instead of scipy.fftpack, etc.), though they are usually slower.
Another recent arrival that has made a huge impact on my daily work is conda-forge. This project greatly democratizes conda packages. Now anyone can build their own packages for Linux, Windows, and Mac. It would be awesome to have some for Android as well. Assuming that you can get conda installed, the major difficulty might be finding a continuous integration service that runs Android and setting up the infrastructure.
Let me know if you try this out! Is there another setup that you use? What else is missing? Do you think we'll be able to fully work like this one day?
The Jupyter logo was downloaded from their Github repository (jupyter/design). The Android logo is CC BY 2.5 Google Inc., via Wikimedia Commons.
Last week, John Leeman wrote a list of podcast recommendations and I thought I'd share mine here as well. I have been listening to podcasts since around 2014 and I experimented with a few before finding some that I really enjoy. I still subscribe to new podcasts when I find them and listen to a couple of episodes, but lately none have stuck with me¹.
Here are the ones that stayed with me throughout 2016:
And here are the ones that I tried but ended up dropping for some reason:
I subscribe and listen to all of them using the Podcast Addict app on my Android phone. You can also get most of them through Google Play Music or iTunes.
What are your recommendations?
¹ I know I've been doing too much Python when I have to fight the urge to capitalize "none".
This paper is a chapter of my PhD thesis. It describes a new gravity inversion method to estimate the depth of the crust-mantle interface (the Moho). The inversion uses a spherical Earth approximation by discretizing the Earth into tesseroids (spherical prisms). The forward modeling method used is described in the paper Tesseroids: forward modeling gravitational fields in spherical coordinates. We applied the inversion to estimate the Moho depth for South America.
The main result from this publication is the gravity-derived Moho depth model for South America and the differences between it and seismological estimates of Assumpção et al. (2013). These differences allow us to know where the gravity-derived model can be trusted and where there might be unaccounted sources in the gravity data.
You can download the model results, source code, and input data from doi:10.6084/m9.figshare.3987267
[high resolution version]
Figure caption: Dotted lines represent the boundaries between major geologic provinces. AD: Andean Province, AFB: Andean foreland basins, AM: Amazonas Basin, BR: Brazilian Shield, BO: Borborema province, CH: Chaco Basin, GB: Guyana Basin, GU: Guyana Shield, PB: Parnaíba Basin, PC: Parecis Basin, PR: Paraná Basin, PT: Patagonia province, SF: São Francisco Craton, SM: Solimões Basin. Solid orange lines mark the limits of the main lithospheric plates. AF: Africa Plate, AN: Antarctica Plate, CA: Caribbean Plate, CO: Cocos Plate, SA: South America Plate, SC: Scotia Plate, NZ: Nazca Plate. The solid light grey line is the 35 km Moho depth contour.
The inversion method proposed here is implemented in the Python programming language. The code uses the forward modeling and inversion packages of the library Fatiando a Terra: modeling and inversion for geophysics.
You'll find the source code, input data, and instructions to produce the results from the paper on the Github repository. There should be enough information for you to produce all figures of the paper.
You can run the Jupyter notebooks online without installing anything thanks to the awesome free Binder web service. Follow the link below and open any notebook in the code folder. Beware that the CRUST1.0 synthetic and the South American Moho results will take hours or days to run.
mybinder.org/repo/pinga-lab/paper-moho-inversion-tesseroids
Estimating the relief of the Moho from gravity data is a computationally intensive non-linear inverse problem. What is more, the modeling must take the Earth's curvature into account when the study area is of regional scale or greater. We present a regularized non-linear gravity inversion method that has a low computational footprint and employs a spherical Earth approximation. To achieve this, we combine the highly efficient Bott's method with smoothness regularization and a discretization of the anomalous Moho into tesseroids (spherical prisms). The computational efficiency of our method is attained by harnessing the fact that all matrices involved are sparse. The inversion results are controlled by three hyper-parameters: the regularization parameter, the anomalous Moho density-contrast, and the reference Moho depth. We estimate the regularization parameter using the method of hold-out cross-validation. Additionally, we estimate the density-contrast and the reference depth using knowledge of the Moho depth at certain points. We apply the proposed method to estimate the Moho depth for the South American continent using satellite gravity data and seismological data. The final Moho model is in accordance with previous gravity-derived models and seismological data. The misfit to the gravity and seismological data is worse in the Andes and best in oceanic areas, central Brazil and Patagonia, and along the Atlantic coast. Similarly to previous results, the model suggests a thinner crust of 30-35 km under the Andean foreland basins. Discrepancies with the seismological data are greatest in the Guyana Shield, the central Solimões and Amazonas Basins, the Paraná Basin, and the Borborema province. These differences suggest the existence of crustal or mantle density anomalies that were unaccounted for during gravity data processing.
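The reason Bott's method is so cheap can be shown with a planar caricature (a sketch of my own with made-up numbers, not the paper's tesseroid implementation): approximate each Moho undulation by a Bouguer slab, so the predicted anomaly is 2πGΔρΔz, and correct the relief each iteration by the residual divided by that constant. No sensitivity matrix is ever built or inverted.

```python
import numpy as np

GRAV = 6.674e-11  # gravitational constant (SI)
SI2MGAL = 1e5     # m/s^2 to mGal

def bott(gravity_mgal, density_contrast, maxit=10):
    """Estimate relief (meters) from a gravity anomaly (mGal) with
    Bott-style iterations. Here the slab formula serves as both forward
    model and update step; the real method recomputes the predicted data
    with tesseroids and adds smoothness regularization."""
    slab = 2 * np.pi * GRAV * density_contrast * SI2MGAL  # mGal per meter
    relief = np.zeros_like(gravity_mgal)
    for _ in range(maxit):
        predicted = slab * relief
        relief = relief + (gravity_mgal - predicted) / slab
    return relief
```

With a genuinely non-linear forward model the loop takes a few iterations to converge, but each one costs only a forward computation.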
Uieda, L., and V. C. F. Barbosa (2017), Fast nonlinear gravity inversion in spherical coordinates with application to the South American Moho, Geophys. J. Int., 208(1), 162-176, doi:10.1093/gji/ggw390.
This paper is a chapter of my PhD thesis. It describes the algorithms used in version 1.2.0 of the open-source software Tesseroids: gravity forward modeling in spherical coordinates. The software is a suite of command-line programs written in C that calculate the gravitational field of a tesseroid (spherical prism) model. There is also a separate Python implementation of the same algorithm in the fatiando.gravmag.tesseroid module of the open-source library Fatiando a Terra: modeling and inversion for geophysics (introduced in version 0.3).
Presentations about the modeling methods and previous versions of the software:
We present the open-source software Tesseroids, a set of command-line programs to perform the forward modeling of gravitational fields in spherical coordinates. The software is implemented in the C programming language and uses tesseroids (spherical prisms) for the discretization of the subsurface mass distribution. The gravitational fields of tesseroids are calculated numerically using the Gauss-Legendre Quadrature (GLQ). We have improved upon an adaptive discretization algorithm to guarantee the accuracy of the GLQ integration. Our implementation of adaptive discretization uses a "stack" based algorithm instead of recursion to achieve more control over execution errors and corner cases. The algorithm is controlled by a scalar value called the distance-size ratio (D) that determines the accuracy of the integration as well as the computation time. We determined optimal values of D for the gravitational potential, gravitational acceleration, and gravity gradient tensor by comparing the computed tesseroids effects with those of a homogeneous spherical shell. The values required for a maximum relative error of 0.1% of the shell effects are D = 1 for the gravitational potential, D = 1.5 for the gravitational acceleration, and D = 8 for the gravity gradients. Contrary to previous assumptions, our results show that the potential and its first and second derivatives require different values of D to achieve the same accuracy. These values were incorporated as defaults in the software.
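The stack-based adaptive discretization can be caricatured in one dimension (a hypothetical sketch, not the C implementation): keep splitting an element until the computation point is farther away than the distance-size ratio times the element size, pushing the halves onto an explicit stack instead of recursing.

```python
def adaptive_split(start, end, point, ratio, min_size=1e-3):
    """
    1D caricature of the adaptive discretization: split an interval until
    the computation point is at least `ratio` times the interval size away
    (or the interval hits a minimum size). An explicit stack replaces
    recursion, giving more control over errors and corner cases. The real
    algorithm does this for tesseroids in three dimensions before applying
    the Gauss-Legendre Quadrature.
    """
    accepted = []
    stack = [(start, end)]
    while stack:
        a, b = stack.pop()
        size = b - a
        distance = abs(point - 0.5 * (a + b))
        if distance < ratio * size and size > min_size:
            mid = 0.5 * (a + b)
            stack.append((a, mid))
            stack.append((mid, b))
        else:
            accepted.append((a, b))
    return accepted
```

Elements close to the computation point end up small (accurate integration where it matters) while distant ones stay large (cheap), which is exactly the trade-off that the value of D controls.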
Uieda, L., V. Barbosa, and C. Braitenberg (2016), Tesseroids: Forward-modeling gravitational fields in spherical coordinates, GEOPHYSICS, F41–F48, doi:10.1190/geo2015-0204.1.
Airborne gravity gradiometry data have been recently used in mining surveys to map the 3D geometry of ore deposits. This task can be achieved by different gravity-gradient inversion methods, many of which use a voxel-based discretization of the Earth's subsurface. To produce a unique and stable solution, an inversion method introduces particular constraints. One constraining inversion introduces a depth-weighting function in the first-order Tikhonov regularization imposing a smoothing on the density-contrast distributions that are not restricted to near-surface regions. Another gravity-gradient inversion, the method of planting anomalous densities, imposes compactness and sharp boundaries on the density-contrast distributions. We used these two inversion methods to invert the airborne gravity-gradient data over the iron-ore deposit at the southern flank of the Gandarela syncline in Quadrilátero Ferrífero (Brazil). Because these methods differ from each other in the particular constraint used, the estimated 3D density-contrast distributions reveal different geologic features of ore deposit. The depth-weighting smoothing inversion reveals variable dip directions along the strike of the retrieved iron-ore body. The planting anomalous density inversion estimates a compact iron-ore mass with a single density contrast, which reveals a variable volume of the iron ore along its strike increasing towards the hinge zone of the Gandarela syncline which is the zone of maximum compression. The combination of the geologic features inferred from each estimate leads to a synergistic effect, revealing that the iron-ore deposit is strongly controlled by the Gandarela syncline.
Carlos, D. U., L. Uieda, and V. C. F. Barbosa (2016), How two gravity-gradient inversion methods can be used to reveal different geologic features of ore deposit — A case study from the Quadrilátero Ferrífero (Brazil), Journal of Applied Geophysics, doi:10.1016/j.jappgeo.2016.04.011.
The method described in this article has been implemented in the open-source geophysics library Fatiando a Terra. The method was first introduced in version 0.3 as the fatiando.gravmag.magdir.DipoleMagDir class (see PR 87 for the full development history of this implementation). See the project documentation and the code repository of this paper (pinga-lab/Total-magnetization-of-spherical-bodies) for more information about using this class.
This paper has undergone open peer-review. The original submission, reviews, and replies can be viewed at doi:10.5194/npgd-1-1465-2014.
We have developed a fast total-field anomaly inversion to estimate the magnetization direction of multiple sources with approximately spherical shapes and known centres. Our method is an overdetermined inverse problem that can be applied to interpret multiple sources with different but homogeneous magnetization directions. It requires neither the prior computation of any transformation-like reduction to the pole nor the use of regularly spaced data on a horizontal grid. The method contains flexibility to be implemented as a linear or non-linear inverse problem, which results, respectively, in a least-squares or robust estimate of the components of the magnetization vector of the sources. Applications to synthetic data show the robustness of our method against interfering anomalies and errors in the location of the sources' centre. Besides, we show the feasibility of applying the upward continuation to interpret non-spherical sources. Applications to field data over the Goiás alkaline province (GAP), Brazil, show the good performance of our method in estimating geologically meaningful magnetization directions. The results obtained for a region of the GAP, near to the alkaline complex of Diorama, suggest the presence of non-outcropping sources marked by strong remanent magnetization with inclination and declination close to −70.35 and −19.81°, respectively. This estimated magnetization direction leads to predominantly positive reduced-to-the-pole anomalies, even for other region of the GAP, in the alkaline complex of Montes Claros de Goiás. These results show that the non-outcropping sources near to the alkaline complex of Diorama have almost the same magnetization direction of those ones in the alkaline complex of Montes Claros de Goiás, strongly suggesting that these sources have been emplaced in the crust within almost the same geological time interval.
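The linear-algebra core of the method can be sketched like this (a toy with made-up geometry and values, not the DipoleMagDir code): the total-field anomaly of a dipole with known centre is linear in its magnetization vector, so estimating that vector from many observations is an overdetermined least-squares problem.

```python
import numpy as np

CM = 1e-7  # mu_0 / (4 * pi) in SI units

def sensitivity(coords, center, field_dir):
    """Each row maps a dipole moment at `center` to the projection of its
    field onto the main-field direction at one observation point, which
    approximates the total-field anomaly there."""
    r = coords - center
    dist = np.linalg.norm(r, axis=1)
    rhat = r / dist[:, None]
    return CM * (3 * rhat * (rhat @ field_dir)[:, None] - field_dir) / dist[:, None]**3

# Hypothetical survey: 50 stations on a plane above a source at 200 m depth.
rng = np.random.default_rng(42)
coords = rng.uniform(-500, 500, size=(50, 3))
coords[:, 2] = -100.0
center = np.array([0.0, 0.0, 200.0])
field_dir = np.array([0.0, 0.9063, 0.4226])  # made-up main-field direction
field_dir = field_dir / np.linalg.norm(field_dir)

A = sensitivity(coords, center, field_dir)
m_true = np.array([1e9, -2e9, 5e8])  # "true" dipole moment
data = A @ m_true                    # noise-free synthetic anomaly
m_est, *_ = np.linalg.lstsq(A, data, rcond=None)
```

Because the problem is linear, no reduction to the pole or gridding is needed, which is one of the selling points of the paper.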
@article{oliveira_jr._estimation_2015,
  title = {Estimation of the total magnetization direction of approximately spherical bodies},
  volume = {22},
  issn = {1607-7946},
  doi = {10.5194/npg-22-215-2015},
  number = {2},
  journal = {Nonlin. Processes Geophys.},
  author = {Oliveira Jr., V. C. and Sales, D. P. and Barbosa, V. C. F. and Uieda, L.},
  year = {2015},
  pages = {215--232},
}
Oliveira Jr., V. C., D. P. Sales, V. C. F. Barbosa, and L. Uieda (2015), Estimation of the total magnetization direction of approximately spherical bodies, Nonlin. Processes Geophys., 22(2), 215-232, doi:10.5194/npg-22-215-2015.
This is a presentation I gave for the Department of Geophysics of the University of São Paulo. It's about my open-source project Fatiando a Terra and how I'm using it for teaching geophysics and doing my own research on inverse problems.
In the GitHub repository you'll find the slides and accompanying IPython notebooks for the demos.
O Fatiando a Terra (www.fatiando.org) é uma biblioteca feita na linguagem Python que tem como objetivo facilitar o trabalho de pesquisadores e professores na área geofísica. Os módulos da biblioteca foram planejados para facilitar a combinação de seus componentes de diversas formas. Por exemplo, o mesmo módulo de modelagem direta pode ser usado para produzir dados sintéticos, desenvolver um método de inversão ou como parte de uma interface gráfica interativa. Além disso, as funções da biblioteca podem ser combinadas com funções desenvolvidas pelo usuário e com as muitas bibliotecas científicas da linguagem Python. O módulo de problemas inversos automatiza grande parte da implementação de um novo método de inversão. O pesquisador implementa somente o cálculo de dados preditos e da matriz de sensibilidade, ambos reutilizando os diversos módulos de modelagem direta. Com essas duas funções, o usuário pode escolher livremente entre diversos métodos de optimização e regularização para executar sua inversão.
Para o ensino de geofísica, a biblioteca pode ser combinada com a interatividade de outros programas, particularmente o IPython notebook (www.ipython.org). Conceitos difíceis de serem transmitidos em aula podem ser explorados pelos alunos de forma interativa, com botões, gráficos e animações. Por exemplo, para ensinar a reflexão e refração de ondas sísmicas, o professor pode utilizar simulações numéricas da propagação de ondas para produzir animações em tempo real. Outro exemplo é permitir aos alunos explorar como o campo geomagnético interage com um corpo geológico a diferentes latitudes para produzir uma anomalia magnética de campo total. Dessa forma, os alunos ganham experiência e intuição ao interagir com os resultados.
Implementing many geophysical methods in a single library provides the foundation needed for quickly creating new methodologies and interactive teaching materials. Most of the current functionality is for gravity and magnetic methods, though a core of seismics and seismology is under development. The project needs users and developers to grow and cover the other branches of geophysics. The project is free software and contributions of any kind are welcome.
]]>This paper uses the planting inversion proposed in our 2012 paper. We used the implementation in the open-source Fatiando a Terra Python library (the fatiando.gravmag.harvester module). The module was introduced in version 0.1 of the library.
The Quadrilátero Ferrífero in southeastern Brazil hosts one of the largest concentrations of lateritic iron ore deposits in the world. Our study area is over the southern flank of the Gandarela syncline, which is one of the regional synclines of the Quadrilátero Ferrífero. The Gandarela syncline is considered the Brazilian megastructure with the best prospects for iron ore exploration. Most of the iron-ore deposits of the Quadrilátero Ferrífero are non-outcropping bodies hosted in the oxidized, metamorphosed and heterogeneously deformed banded iron formations. Therefore, assessing the 3D geometry of the iron-ore body is of the utmost importance for estimating reserves and planning production development. We carried out a quantitative interpretation of the iron-ore deposit of the southern flank of the Gandarela syncline using a 3D inversion of airborne gravity-gradient data to estimate the shape of the iron-ore mineralization. The retrieved body is characterized by a high-density zone associated with the northeast-elongated iron formation. The thickness and the width of the retrieved iron-ore body vary along its strike, increasing southwestward. The presence of a large volume of iron ore in the southwest portion of the study area may be due to the hinge zone of the Gandarela syncline, which is the zone of maximum compression. Our estimated iron-ore mass reveals variable dip directions. In the southernmost, central and northernmost portions of the study area, the estimated iron body dips, respectively, inwards, vertically and outwards with respect to the syncline axis. Previous geological mapping indicated continuous mineralization. However, our result suggests a quasi-continuous iron-ore body. In the central part of the study area, the estimated iron-ore body is segmented into two parts. This breakup may be related to the northwest-trending faults, which are perpendicular to the northeast-trending axis of the Gandarela syncline.
Our estimated iron-ore mass agrees reasonably well with the information provided from the lithologic logging data of drill holes. In this geophysical study, the estimated iron-ore reserves are approximately 3 billion tons.
@article{carlos2014, title = {Imaging iron ore from the {Quadrilátero} {Ferrífero} ({Brazil}) using geophysical inversion and drill hole data}, volume = {61}, issn = {0169-1368}, doi = {10.1016/j.oregeorev.2014.02.011}, journal = {Ore Geology Reviews}, author = {Carlos, Dionísio U. and Uieda, Leonardo and Barbosa, Valeria C. F.}, month = sep, year = {2014}, pages = {268--285} }
Carlos, D. U., L. Uieda, and V. C. F. Barbosa (2014), Imaging iron ore from the Quadrilátero Ferrífero (Brazil) using geophysical inversion and drill hole data, Ore Geology Reviews, 61, 268-285, doi:10.1016/j.oregeorev.2014.02.011
]]>Inverse problems haunt the nightmares of geophysics graduate students. I'll demonstrate how to conquer them using Fatiando a Terra. The new machinery in Fatiando contains many ready-to-use components and automates as much of the process as possible. You can go from zero to regularized gravity inversion with as little as 30 lines of code. I'll walk through an example to show you how.
The inner properties of the Earth can usually only be inferred through indirect measurements of their effects. For example, density variations cause disturbances in the gravity field and seismic velocity variations affect the path of seismic waves. From a mathematical point of view, this inference is an inverse problem. To complicate things, geophysical inverse problems are usually ill-posed, meaning that a solution may not exist, may not be unique, and may not be stable under small changes in the data.
These problems can usually be resolved through least-squares estimation and regularization.
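A small numerical illustration of why the regularization matters: with a near-singular sensitivity matrix, tiny noise in the data ruins the plain least-squares estimate, while a zeroth-order Tikhonov (ridge) term restores stability. This is a generic sketch, not code from the talk:

```python
import numpy as np

# An almost-singular sensitivity matrix makes the plain least-squares
# solution unstable: tiny noise in the data causes wild parameter swings.
G = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
d_true = G @ np.array([1.0, 1.0])          # true parameters are [1, 1]
d_noisy = d_true + np.array([0.0, 1e-4])   # tiny perturbation

# Unregularized normal equations: estimate lands far from [1, 1]
p_plain = np.linalg.solve(G.T @ G, G.T @ d_noisy)

# Zeroth-order Tikhonov (ridge) regularization damps the instability
mu = 1e-3
p_reg = np.linalg.solve(G.T @ G + mu * np.eye(2), G.T @ d_noisy)
```

The regularized estimate stays close to the true parameters at the cost of a small bias, which is the usual trade-off when stabilizing an ill-posed problem.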
Research in geophysical inverse problems involves the development of: new methodologies for parametrization, different approaches to regularization, new algorithms to handle large-scale problems, combinations of existing methods, etc. All of the aforementioned developments require the creation of software, usually from scratch. Furthermore, most scientific software is not designed with reuse in mind, making remixing published methods difficult, if not impossible.
We tackled these problems by developing fatiando.inversion, a framework for solving inverse problems in Fatiando a Terra. The goals of fatiando.inversion are:
In this talk, I'll briefly cover the mathematics involved and the design of our new API. I'll walk through the process of implementing a new inverse problem (in about 30 lines of code) using the example of estimating the relief of a sedimentary basin from its gravity anomaly. Finally, I'll conclude by outlining how we are using this framework in our own research, what we are currently working on, and our plans for the future.
As a bonus, I made this gif for the Twitter hashtag #scipy2014:
.@kwinkunks @AdventureMomo @_row1 how about this? (code is here https://t.co/9T3J27p0CG) #scipy2014 pic.twitter.com/5ryXw0L66X
— Leonardo Uieda (@leouieda) July 9, 2014
]]>EGU abstract ID: EGU2014-10898-1
Satellite observations of the gravity field have provided geophysicists with exceptionally dense and uniform coverage of data over vast areas. This enables regional or global scale high resolution geophysical investigations. Techniques like forward modeling and inversion of gravity anomalies are routinely used to investigate large geologic structures, such as large igneous provinces, suture zones, intracratonic basins, and the Moho. Accurately modeling such large structures requires taking the sphericity of the Earth into account. A reasonable approximation is to assume a spherical Earth and use spherical coordinates.
In recent years, efforts have been made to advance forward modeling in spherical coordinates using tesseroids, particularly with respect to speed and accuracy. Conversely, traditional space-domain inverse modeling methods have not yet been adapted to use spherical coordinates and tesseroids. A range of inversion methods has been developed in the literature for Cartesian coordinates and right rectangular prisms. These include methods for estimating the relief of an interface, like the Moho or the basement of a sedimentary basin. Another category includes methods to estimate the density distribution in a medium. The latter apply many algorithms to solve the inverse problem, ranging from analytic solutions to random search methods as well as systematic search methods.
We present an adaptation for tesseroids of the systematic search method of "planting anomalous densities". This method can be used to estimate the geometry of geologic structures. As prior information, it requires knowledge of the approximate densities and positions of the structures. The main advantage of this method is its computational efficiency, requiring little computer memory and processing time. We demonstrate the shortcomings and capabilities of this approach using applications to synthetic and field data. Performing the inversion of gravity and gravity gradient data, simultaneously or separately, is straightforward and requires no changes to the existing algorithm. This feature makes it ideal for inverting the multicomponent gravity gradient data from the GOCE satellite.
An implementation of our adaptation is freely available in the open-source modeling and inversion package Fatiando a Terra (http://www.fatiando.org).
]]>This article is part of the Geophysical Tutorials section in The Leading Edge, started by Matt Hall of Agile Geoscience. All tutorials are Open-Access and include open-source code examples. Read the February 2016 tutorial by Matt for an introduction to the tutorial series and what you need to know to get started running the code in them.
This article uses the Euler deconvolution implemented in Fatiando a Terra, an open-source Python library. See the repository pinga-lab/paper-tle-euler-tutorial for the source code that accompanies the article and extra material.
In this tutorial we'll talk about a widely used method of interpretation for potential-field data called Euler deconvolution. Our goal is to demonstrate its usefulness and, most importantly, call attention to some pitfalls encountered in the interpretation of the results. The code and synthetic data required to reproduce our results and figures can be found in the accompanying IPython notebooks (ipython.org/notebook) at dx.doi.org/10.6084/m9.figshare.923450 or github.com/pinga-lab/paper-tle-euler-tutorial. The notebooks also expand the analysis presented here. We encourage you to download the data and try it on your software of choice. For this tutorial we'll use the implementation in the open-source Python package Fatiando a Terra (fatiando.org).
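At its core, Euler deconvolution is a small linear system derived from Euler's homogeneity equation, solved by least squares for the source position and base level. Here is a minimal NumPy sketch of that textbook formulation (not the fatiando.gravmag implementation), verified on a synthetic 1/r field, which has structural index 1:

```python
import numpy as np

def euler_deconvolution(x, y, z, field, dx, dy, dz, structural_index):
    """Solve Euler's homogeneity equation by least squares for the
    source position (x0, y0, z0) and the base level b.

    Sketch of the textbook formulation, not the fatiando implementation.
    """
    eta = structural_index
    # Rearranged Euler equation, one row per observation:
    #   x0*dx + y0*dy + z0*dz + eta*b = x*dx + y*dy + z*dz + eta*field
    A = np.column_stack([dx, dy, dz, np.full_like(field, eta)])
    rhs = x * dx + y * dy + z * dz + eta * field
    estimate, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return estimate  # [x0, y0, z0, base level]

# Synthetic test: f = 1/r is homogeneous of degree -1 (structural index 1)
rng = np.random.default_rng(0)
x, y = rng.uniform(-10, 10, 50), rng.uniform(-10, 10, 50)
z = np.zeros(50)
x0, y0, z0 = 2.0, -3.0, 5.0  # true source position (z positive down)
r = np.sqrt((x - x0)**2 + (y - y0)**2 + (z - z0)**2)
field = 1.0 / r
fx, fy, fz = -(x - x0) / r**3, -(y - y0) / r**3, -(z - z0) / r**3
solution = euler_deconvolution(x, y, z, field, fx, fy, fz, structural_index=1)
```

With noise-free data and the correct structural index the estimates are exact; the pitfalls discussed in the tutorial appear when the structural index is wrong or the windowed anomalies interfere.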
@article{uieda2014, title = {Geophysical tutorial: {Euler} deconvolution of potential-field data}, volume = {33}, issn = {1070-485X, 1938-3789}, doi = {10.1190/tle33040448.1}, number = {4}, journal = {The Leading Edge}, author = {Uieda, Leonardo and Oliveira Jr., Vanderlei C. and Barbosa, Valéria C. F.}, month = apr, year = {2014}, pages = {448--450} }
Uieda, L., V. C. Oliveira Jr, and V. C. F. Barbosa (2014), Geophysical tutorial: Euler deconvolution of potential-field data, The Leading Edge, 33(4), 448-450, doi:10.1190/tle33040448.1
]]>We have developed a new method that drastically reduces the number of the source location estimates in Euler deconvolution to only one per anomaly. Our method employs the analytical estimators of the base level and of the horizontal and vertical source positions in Euler deconvolution as a function of the x- and y-coordinates of the observations. By assuming any tentative structural index (defining the geometry of the sources), our method automatically locates plateaus, on the maps of the horizontal coordinate estimates, indicating consistent estimates that are very close to the true corresponding coordinates. These plateaus are located in the neighborhood of the highest values of the anomaly and show a contrasting behavior with those estimates that form inclined planes at the anomaly borders. The plateaus are automatically located on the maps of the horizontal coordinate estimates by fitting a first-degree polynomial to these estimates in a moving-window scheme spanning all estimates. The positions where the angular coefficient estimates are closest to zero identify the plateaus of the horizontal coordinate estimates. The sample means of these horizontal coordinate estimates are the best horizontal location estimates. After mapping each plateau, our method takes as the best structural index the one that yields the minimum correlation between the total-field anomaly and the estimated base level over each plateau. By using the estimated structural index for each plateau, our approach extracts the vertical coordinate estimates over the corresponding plateau. The sample means of these estimates are the best depth location estimates in our method. When applied to synthetic data, our method yielded good results if the bodies produce weak- and mid-interfering anomalies. A test on real data over intrusions in the Goiás Alkaline Province, Brazil, retrieved sphere-like sources suggesting 3D bodies.
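The plateau-detection step can be illustrated in one dimension: fit a first-degree polynomial in a moving window and keep the window whose slope estimate is closest to zero, then take the sample mean over that window as the best estimate. A sketch of the idea (the paper operates on 2D maps of estimates; the profile below is made up):

```python
import numpy as np

def find_plateau(positions, estimates, window):
    """Locate the flattest stretch of a profile of Euler estimates by
    fitting a first-degree polynomial in a moving window and picking
    the window whose angular coefficient is closest to zero.

    Returns the sample mean over that window as the best estimate.
    1D illustration of the idea; the paper works on 2D maps.
    """
    best_slope, best_start = np.inf, 0
    for start in range(len(positions) - window + 1):
        sl = slice(start, start + window)
        slope, _ = np.polyfit(positions[sl], estimates[sl], 1)
        if abs(slope) < abs(best_slope):
            best_slope, best_start = slope, start
    plateau = slice(best_start, best_start + window)
    return np.mean(estimates[plateau])

# Synthetic profile: inclined planes at the borders (the inconsistent
# estimates near the anomaly edges) and a plateau at 42 in the middle
pos = np.linspace(0, 10, 101)
est = np.where(pos < 3, 42 - 5 * (3 - pos),
      np.where(pos > 7, 42 + 5 * (pos - 7), 42.0))
best = find_plateau(pos, est, window=21)
```

The moving-window linear fit rejects the inclined planes at the borders and recovers the plateau value, mirroring how the method separates consistent coordinate estimates from the unstable ones.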
@article{melo2013, title = {Estimating the nature and the horizontal and vertical positions of 3D magnetic sources using {Euler} deconvolution}, volume = {78}, issn = {0016-8033, 1942-2156}, doi = {10.1190/geo2012-0515.1}, number = {6}, journal = {GEOPHYSICS}, author = {Melo, Felipe F. and Barbosa, Valeria C. F. and Uieda, Leonardo and Oliveira, Vanderlei C. and Silva, João B. C.}, month = oct, year = {2013}, pages = {J87--J98} }
Melo, F. F., V. C. F. Barbosa, L. Uieda, V. C. Oliveira Jr, and J. B. C. Silva (2013), Estimating the nature and the horizontal and vertical positions of 3D magnetic sources using Euler deconvolution, Geophysics, 78(6), J87-J98, doi:10.1190/geo2012-0515.1
]]>Conference proceedings site: http://conference.scipy.org/proceedings/scipy2013/uieda.html
Solid Earth geophysics is the science of using physical observations of the Earth to infer its inner structure. Generally, this is done with a variety of numerical modeling techniques and inverse problems. The development of new algorithms usually involves copying and pasting code, which leads to errors and poor code reuse. Added to this is a modeling pipeline composed of various tools that don't communicate with each other (Fortran/C for computations, large complicated I/O files, Matlab/VTK for visualization, etc). Fatiando a Terra is a Python library that aims to unify the modeling pipeline inside of the Python language. This allows users to replace the traditional shell scripting with more versatile and powerful Python scripting. Together with the new IPython notebook, Fatiando a Terra can integrate all stages of the geophysical modeling process, like data pre-processing, inversion, statistical analysis, and visualization. However, the library can also be used for quickly developing stand-alone programs that can be integrated into existing pipelines. Plus, because functions inside Fatiando a Terra use a common data and mesh format, existing algorithms can be combined and new ideas can build upon existing functionality. This flexibility facilitates reproducible computations, prototyping of new algorithms, and interactive teaching exercises. Although the project has so far focused on potential field methods (gravity and magnetics), some numerical tools for other geophysical methods have been developed as well. The library already contains: fast implementations of forward modeling algorithms (using Numpy and Cython), generic inverse problem solvers, unified geometry classes (prism meshes, polygons, etc), functions to automate repetitive plotting tasks with Matplotlib (automatic gridding, simple GUIs, picking, projections, etc) and Mayavi (automatic conversion of geometry classes to VTK, drawing continents, etc).
In the future, we plan to continuously implement classic and state-of-the-art algorithms as well as sample problems to help teach geophysics.
Uieda, L., V. C. Oliveira Jr and V. C. F. Barbosa (2013), Modeling the Earth with Fatiando a Terra, Proceedings of the 12th Python in Science Conference, pp. 90-96
]]>This talk presents an adaptation of the gravity-gradient inversion method I developed for my Masters degree dissertation "Robust 3D gravity gradient inversion by planting anomalous densities" to invert magnetic data.
As you may have noticed, there is an error in the title. We do not, in fact, invert magnetic data using density anomalies. This illustrates the perils of copy-pasting combined with a looming deadline.
The inversion method was developed along a few iterations and presented at different meetings (in order):
I have added an open-source implementation of the gravity-gradient inversion method to the Python library Fatiando a Terra. In versions 0.1 through 0.4, the code is in the fatiando.gravmag.harvester module.
AGU abstract ID: GP22A-01
We present a new 3D magnetic inversion algorithm based on the computationally efficient method of planting anomalous densities. The algorithm consists of an iterative growth of the anomalous bodies around prismatic elements called "seeds". These seeds are user-specified and have known magnetizations. Thus, the seeds provide a way for the interpreter to specify the desired skeleton of the anomalous bodies. The inversion algorithm is computationally efficient due to various optimizations made possible by the iterative nature of the growth process. The control provided by the use of seeds allows one to test different hypotheses about the geometry and magnetization of targeted anomalous bodies. To demonstrate this capability, we applied our inversion method to the Morro do Engenho (ME) and A2 magnetic anomalies, central Brazil (Figure 1a). ME is an outcropping alkaline intrusion formed by dunites, peridotites and pyroxenites with known magnetization. A2 is a magnetic anomaly to the Northeast of ME and is thought to be a similar intrusion that is not outcropping. Therefore, a plausible hypothesis is that A2 has the same magnetization as ME. We tested this hypothesis by performing an inversion using a single seed for each body. Both seeds had the same magnetization. Figure 1b shows that the inversion produced residuals up to 2000 nT over A2 (i.e., a poor fit) and less than 400 nT over ME (i.e., an acceptable fit). Figure 1c shows that ME is a compact outcropping body with bottom at approximately 5 km, which is in agreement with previous interpretations. However, the estimate produced by the inversion for A2 is outcropping and is not compact. In summary, the estimate for A2 provides a poor fit to the observations and is not in accordance with the geologic information. This leads to the conclusion that A2 does not have the same magnetization as ME. These results indicate the usefulness and capabilities of the proposed inversion method.
]]>The Polynomial Equivalent Layer (PEL) is implemented in the open-source Python library Fatiando a Terra. There are two classes for running the PEL in the fatiando.gravmag.eqlayer module: PELGravity for fitting gravitational field components and PELTotalField for fitting the total-field magnetic anomaly. Both classes were introduced in version 0.2 of the library.
We have developed a new cost-effective method for processing large potential-field data sets via the equivalent-layer technique. In this approach, the equivalent layer is divided into a regular grid of equivalent-source windows. Inside each window, the physical-property distribution is described by a bivariate polynomial. Hence, the physical-property distribution within the equivalent layer is assumed to be a piecewise polynomial function defined on a set of equivalent-source windows. We perform any linear transformation of a large set of data as follows. First, we estimate the polynomial coefficients of all equivalent-source windows by using a linear regularized inversion. Second, we transform the estimated polynomial coefficients of all windows into the physical-property distribution within the whole equivalent layer. Finally, we premultiply this distribution by the matrix of Green's functions associated with the desired transformation to obtain the transformed data. The regularized inversion deals with a linear system of equations with dimensions based on the total number of polynomial coefficients within all equivalent-source windows. This contrasts with the classical approach of directly estimating the physical-property distribution within the equivalent layer, which leads to a system based on the number of data. Because the number of data is much larger than the number of polynomial coefficients, the proposed polynomial representation of the physical-property distribution within an equivalent layer drastically reduces the number of parameters to be estimated. By comparing the total number of floating-point operations required to estimate an equivalent layer via our method with the classical approach, both formulated with Cholesky's decomposition, we can verify that the computation time required for building the linear system and for solving the linear inverse problem can be reduced by as many as three and four orders of magnitude, respectively.
Applications to synthetic and real data show that our method performs the standard linear transformations of potential-field data accurately.
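The source of the speedup is the change of parameters: inside each window, the physical-property distribution is described by a handful of bivariate polynomial coefficients instead of one value per equivalent source. A sketch of the design matrix for one window (illustrative, not the paper's code):

```python
import numpy as np

def bivariate_poly_matrix(x, y, degree):
    """Design matrix whose columns are the monomials x**i * y**j with
    i + j <= degree: a bivariate polynomial description of the
    physical-property distribution inside one equivalent-source window.
    Illustrative sketch, not the library's internal representation.
    """
    cols = [x**i * y**j for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    return np.column_stack(cols)

# One 20x20 window: 400 equivalent sources collapse to 6 coefficients
# for a degree-2 polynomial -- the parameter reduction behind the method
x, y = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 20))
B = bivariate_poly_matrix(x.ravel(), y.ravel(), degree=2)
n_sources, n_coeffs = B.shape
```

The inversion then estimates only the polynomial coefficients; multiplying by a matrix like `B` afterwards recovers the full physical-property distribution within the window.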
@article{oliveira2013, title = {Polynomial equivalent layer}, volume = {78}, issn = {0016-8033, 1942-2156}, doi = {10.1190/geo2012-0196.1}, number = {1}, journal = {GEOPHYSICS}, author = {Oliveira Jr., Vanderlei C. and Barbosa, Valéria C. F. and Uieda, Leonardo}, month = jan, year = {2013}, pages = {G1--G13}, }
Oliveira Jr, V. C., V. C. F. Barbosa, and L. Uieda (2013), Polynomial equivalent layer, Geophysics, 78(1), G1–G13, doi:10.1190/geo2012-0196.1
]]>We have interpreted the airborne gravity gradiometry data from the Carajás Mineral Province (CMP), Brazil, using two different 3D inversion methods. Both inversion methods parameterize the Earth's subsurface into prismatic cells and estimate the 3D density-contrast distribution that retrieves an image of the geologic sources subject to an acceptable data misfit. The first inversion method imposes smoothness on the solution by solving a linear system that minimizes a depth-weighted L2 model objective function of the density-contrast distribution. The second imposes compactness on the solution by using an iterative growth algorithm solved by a systematic search that accretes mass around user-specified prisms called “seeds”. Using these two inversion methods, the interpretation of full tensor gravity gradiometry data from an iron ore deposit in the area named N1 at CMP shows consistent geometry and depth for the iron orebody. To date, the maximum depth of the iron orebody is assumed to be 200 m based on the maximum depth attained by the deepest well drilled in this study area. However, both inversion results exhibit a source whose maximum bottom depth is greater than 200 m. These results give rise to two interpretations: i) the iron orebody may extend deeper than the maximum depth of 200 m attained by the deepest borehole; or ii) the iron orebody may be 200 m deep and the rocks below may be jaspilite, whose density is close to that of soft hematite.
Carlos, D. U., L. Uieda, Y. Li, V. C. F. Barbosa, M. A. Braga, G. Angeli, and G. Peres (2012), Iron ore interpretation using gravity-gradient inversions in the Carajás, Brazil, SEG Technical Program Expanded Abstracts, pp. 2008–2012, doi:10.1190/segam2012-0525.1
]]>This talk and expanded abstract present an improvement over the method I developed for my Masters degree dissertation "Robust 3D gravity gradient inversion by planting anomalous densities". The inversion method was developed along a few iterations and presented at different meetings (in order):
I have added an open-source implementation of the method to the Python library Fatiando a Terra. In versions 0.1 through 0.4, the code is in the fatiando.gravmag.harvester module.
We present an improvement to the method of 3D gravity gradient inversion by planting anomalous densities. This method estimates a density-contrast distribution defined on a grid of right-rectangular prisms. Instead of solving large equation systems, the method uses a systematic search algorithm to grow the solution, one prism at a time, around user-specified prisms called "seeds". These seeds have known density contrasts and the solution is constrained to be concentrated around the seeds as well as have their density contrasts. Thus, prior geologic and geophysical information are incorporated into the inverse problem through the seeds. However, this leads to a strong dependence of the solution on the correct location, density contrast, and number of seeds used. Our improvement to this method consists of using the "shape-of-anomaly" data-misfit function in conjunction with the l2-norm data-misfit function. The shape-of-anomaly function measures the difference in shape between the observed and predicted data and is insensitive to differences in amplitude. Tests on synthetic and real data show that the improved method not only has an increased robustness with respect to the number of seeds and their locations, but also provides a better fit of the observed data.
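The amplitude insensitivity of a shape-of-anomaly misfit can be sketched as follows: rescale the predicted data by the factor that best fits the observations, then measure the residual. This follows the general idea rather than the paper's exact formulation:

```python
import numpy as np

def shape_of_anomaly_misfit(observed, predicted):
    """Misfit that compares only the shape of two anomalies, ignoring
    amplitude: the predicted data are rescaled by the least-squares
    factor before the residual norm is computed.
    Sketch of the general idea, not the paper's exact code.
    """
    alpha = observed @ predicted / (predicted @ predicted)
    return np.linalg.norm(observed - alpha * predicted)

x = np.linspace(-5, 5, 101)
anomaly = np.exp(-x**2)        # observed anomaly
same_shape = 3.0 * anomaly     # same shape, 3x the amplitude
shifted = np.exp(-(x - 2)**2)  # same amplitude, different shape

l2_same = np.linalg.norm(anomaly - same_shape)             # large l2 misfit
shape_same = shape_of_anomaly_misfit(anomaly, same_shape)  # ~zero
shape_diff = shape_of_anomaly_misfit(anomaly, shifted)     # clearly nonzero
```

The l2 norm penalizes the amplified copy heavily, while the shape-of-anomaly function sees it as a perfect match and only penalizes the genuinely different shape, which is why combining the two misfits makes the inversion more robust.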
Uieda, L., and V. C. F. Barbosa (2012), Use of the "shape-of-anomaly" data misfit in 3D inversion by planting anomalous densities, SEG Technical Program Expanded Abstracts, pp. 1-6, doi:10.1190/segam2012-0383.1
]]>Forward modeling of potential fields is a useful way to incorporate the interpreter's knowledge about the geology of the interpretation area into the model. However, this can be a very tedious task. This is especially true when modeling in 3D and trying to fit multiple components, e.g., in gravity gradiometry. The interpreter is required to simultaneously supervise the data fit and the construction of geologically realistic 3D bodies. This problem is partially solved by methods of geophysical inversion, which automatically fit the data. Conversely, inverse problems introduce other challenges of their own. Most geophysical inverse problems are ill-posed because their solutions are neither unique nor stable. Thus, they require the introduction of prior information, usually through regularizing functions. Moreover, 3D inverse problems are very computationally expensive. Recent developments in potential field inversion have proposed different regularizing functions to transform the ill-posed problem into a well-posed one. Also, several techniques, like data compression and parallel computation, have been applied to overcome the computational complexity. We call attention to the method of potential field inversion by planting anomalous densities. This method uses an iterative algorithm to automatically grow the anomalous bodies around user-specified prismatic elements called "seeds", which have fixed density contrasts and positions. These seeds provide a first estimate of the skeletal outlines of the presumed anomalous bodies. Then, the inversion iteratively concentrates mass around this "skeleton" in a way that both fits the observed data and yields compact bodies. Therefore, the interpreter can easily impose prior information on the inversion through the seeds. The interpreter needs only to supply a few seeds that specify the sources' skeleton, eliminating the exhaustive task of specifying the complete geometry of multiple sources.
Moreover, the interpreter is freed from the time-consuming procedure of producing a reasonable fit to the data. Due to its high computational efficiency, the method of planting anomalous densities can be used to quickly test geologic hypotheses of different locations and density contrasts for presumed sources. To test a hypothesis, one would choose the locations and density contrasts of the seeds accordingly and verify if the inversion result is able to fit the observed data. If it is not able, then the hypothesis can be rejected and a new one can be formulated and tested. Otherwise, there is no reason to reject the hypothesis on the basis of the geophysical data. Thus, the method can be viewed as enhanced forward modeling. The method of planting anomalous densities can be used with both gravity and gravity gradient data. This makes it an ideal tool to interpret compact geologic bodies using the new generation GOCE data. We present applications to synthetic and real data that illustrate the usefulness of our method.
]]>This was my first publication and the topic of my Masters dissertation. The inversion method was developed along successive iterations and presented in meetings at each step (in order):
The inversion method proposed in this paper is implemented in the open-source Fatiando a Terra Python library as the fatiando.gravmag.harvester module. The module was introduced in version 0.1 of the library.
The following is an animation of the growth algorithm during the inversion of synthetic data. The video is available at figshare: 10.6084/m9.figshare.91469
We have developed a new gravity gradient inversion method for estimating a 3D density-contrast distribution defined on a grid of rectangular prisms. Our method consists of an iterative algorithm that does not require the solution of an equation system. Instead, the solution grows systematically around user-specified prismatic elements, called “seeds,” with given density contrasts. Each seed can be assigned a different density-contrast value, allowing the interpretation of multiple sources with different density contrasts and that produce interfering signals. In real world scenarios, some sources might not be targeted for the interpretation. Thus, we developed a robust procedure that neither requires the isolation of the signal of the targeted sources prior to the inversion nor requires substantial prior information about the nontargeted sources. In our iterative algorithm, the estimated sources grow by the accretion of prisms in the periphery of the current estimate. In addition, only the columns of the sensitivity matrix corresponding to the prisms in the periphery of the current estimate are needed for the computations. Therefore, the individual columns of the sensitivity matrix can be calculated on demand and deleted after an accretion takes place, greatly reducing the demand for computer memory and processing time. Tests on synthetic data show the ability of our method to correctly recover the geometry of the targeted sources, even when interfering signals produced by nontargeted sources are present. Inverting the data from an airborne gravity gradiometry survey flown over the iron ore province of Quadrilátero Ferrífero, southeastern Brazil, we estimated a compact iron ore body that is in agreement with geologic information and previous interpretations.
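The iterative growth can be caricatured in a few lines: starting from a seed, greedily accrete the neighboring cell that most reduces the data misfit, touching only the sensitivity columns of the candidate cells. A toy 1D sketch without the compactness regularization of the actual method (all names here are illustrative):

```python
import numpy as np

def plant_growth(sensitivity_columns, data, seed, neighbors_of, density, n_steps):
    """Greedy 'planting' growth: starting from a seed cell, repeatedly
    accrete the neighboring cell whose addition most reduces the data
    misfit. Only the sensitivity columns of candidate cells are used,
    mirroring the on-demand column computation described above.
    Simplified sketch without the compactness regularization.
    """
    estimate = {seed}
    predicted = density * sensitivity_columns[seed]
    for _ in range(n_steps):
        candidates = {n for c in estimate for n in neighbors_of(c)} - estimate
        best, best_misfit = None, np.linalg.norm(data - predicted)
        for cell in candidates:
            trial = predicted + density * sensitivity_columns[cell]
            misfit = np.linalg.norm(data - trial)
            if misfit < best_misfit:
                best, best_misfit = cell, misfit
        if best is None:          # no accretion improves the fit: stop
            break
        estimate.add(best)
        predicted = predicted + density * sensitivity_columns[best]
    return estimate

# 1D toy: 10 cells, 50 observation points; the true body is cells 4-6
obs = np.linspace(0, 9, 50)
columns = {j: np.exp(-(obs - j)**2) for j in range(10)}  # fake sensitivities
data = sum(columns[j] for j in {4, 5, 6})
neighbors = lambda c: {c - 1, c + 1} & set(range(10))
body = plant_growth(columns, data, seed=5, neighbors_of=neighbors,
                    density=1.0, n_steps=5)
```

Because each iteration only evaluates the columns of the candidate neighbors, the full sensitivity matrix never needs to be held in memory, which is the key to the method's efficiency.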
@article{uieda2012,
  title = {Robust 3D gravity gradient inversion by planting anomalous densities},
  volume = {77},
  issn = {0016-8033},
  doi = {10.1190/geo2011-0388.1},
  number = {4},
  journal = {Geophysics},
  author = {Uieda, Leonardo and Barbosa, Valéria C. F.},
  year = {2012},
  pages = {G55--G66},
}
Uieda, L., and V. C. F. Barbosa (2012), Robust 3D gravity gradient inversion by planting anomalous densities, Geophysics, 77(4), G55-G66, doi:10.1190/geo2011-0388.1
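The growth-by-accretion idea can be illustrated with a minimal, self-contained sketch. This is not the published algorithm: the real method works on a 3D mesh of rectangular prisms with gravity-gradient kernels and a compactness regularizer, while this toy version uses a 1D grid of cells, a made-up sensitivity kernel, and a pure data-misfit criterion. The fixed density contrast given to the seed mirrors the way seeds carry user-specified density contrasts in the paper.

```python
NCELLS = 12   # toy 1D grid of cells, one observation per cell position
RHO = 500.0   # density contrast assigned to the seed (kg/m^3)

def sensitivity(i, j):
    # Toy kernel: effect of a unit-density cell j on observation i.
    return 1.0 / (1.0 + (i - j) ** 2)

def forward(cells):
    # Predicted data for a set of cells, all with density contrast RHO.
    return [sum(sensitivity(i, j) * RHO for j in cells) for i in range(NCELLS)]

def misfit(data, predicted):
    return sum((d - p) ** 2 for d, p in zip(data, predicted))

def plant(data, seed):
    # Grow the estimate around the seed by accreting, at each step, the
    # peripheral cell that most reduces the data misfit. Stop when no
    # accretion improves the fit.
    estimate = {seed}
    predicted = forward(estimate)
    current = misfit(data, predicted)
    while True:
        periphery = {j + s for j in estimate for s in (-1, 1)
                     if 0 <= j + s < NCELLS} - estimate
        best, best_fit = None, current
        for cell in periphery:
            # The cell's sensitivity column is computed only when needed
            # and discarded afterwards (never stored as a full matrix).
            trial = [p + sensitivity(i, cell) * RHO
                     for i, p in enumerate(predicted)]
            fit = misfit(data, trial)
            if fit < best_fit:
                best, best_fit, best_pred = cell, fit, trial
        if best is None:
            return estimate
        estimate.add(best)
        predicted, current = best_pred, best_fit
```

On this toy problem, growing from a single seed inside the true body recovers the body exactly, and the estimate stops growing once the data are fit.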
]]>This talk and expanded abstract present the second version of what would become my first publication, "Robust 3D gravity gradient inversion by planting anomalous densities", and eventually my Masters dissertation. The inversion method was developed over a few iterations and presented at different meetings (in order):
I have added an open-source implementation of the method to the Python library Fatiando a Terra. In versions 0.1 through 0.4, the code is in the module fatiando.gravmag.harvester.
We present a new gravity gradient inversion method for estimating a 3D density-contrast distribution defined on a grid of prisms. Our method consists of an iterative algorithm that does not require the solution of a large equation system. Instead, the solution grows systematically around user-specified prismatic elements called "seeds". Each seed can be assigned a different density contrast, allowing the interpretation of multiple bodies that have different density contrasts and produce interfering gravitational effects. The compactness of the solution around the seeds is imposed by means of a regularizing function. The solution grows by the accretion of prisms neighboring the current solution. The prisms for the accretion are chosen by systematically searching the set of current neighboring prisms. Therefore, this approach allows the columns of the Jacobian matrix to be calculated on demand, which greatly reduces the demand for computer memory and processing time. Tests on synthetic data and on real data collected over the iron ore province of Quadrilátero Ferrífero, southeastern Brazil, confirmed the ability of our method to detect sharp and compact bodies.
Uieda, L., and V. C. F. Barbosa (2011), Robust 3D gravity gradient inversion by planting anomalous densities, SEG Technical Program Expanded Abstracts, vol. 30, pp. 820–824, doi:10.1190/1.3628201
]]>This talk and expanded abstract are an offshoot of my Masters dissertation, "Robust 3D gravity gradient inversion by planting anomalous densities". They present an adaptation of the gravity-gradient inversion to gravity data. The inversion method was developed over a few iterations and presented at different meetings (in order):
I have added an open-source implementation of the method to the Python library Fatiando a Terra. In versions 0.1 through 0.4, the code is in the module fatiando.gravmag.harvester.
This paper presents a novel gravity inversion method for estimating a 3D density-contrast distribution defined on a grid of prisms. Our method consists of an iterative algorithm that does not require the solution of a large equation system. Instead, the solution grows systematically around user-specified prismatic elements called "seeds". Each seed can have a different density contrast, allowing the interpretation of multiple bodies with different density contrasts and interfering gravitational effects. The compactness of the solution around the seeds is imposed by means of a regularizing function. The solution grows by the accretion of prisms neighboring the current solution. The prisms for the accretion are chosen by systematically searching the set of current neighboring prisms. Therefore, this approach allows the columns of the Jacobian matrix to be calculated on demand. This is a technique known in computer science as "lazy evaluation", which greatly reduces the demand for computer memory and processing time. Tests on synthetic data and on real data collected over the ultramafic Cana Brava complex, central Brazil, confirmed the ability of our method to detect sharp and compact bodies.
Uieda, L., and V. C. F. Barbosa (2011), 3D gravity inversion by planting anomalous densities, SBGf 2011 Expanded Abstracts, pp. 1–5, doi:10.1190/sbgf2011-179
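The "lazy evaluation" of Jacobian columns can be sketched as follows. The kernel here is a toy 2D point-mass-like function, not the actual prism formulas from the paper; the point is only that a column is computed when a cell is accreted and can be discarded immediately afterwards, so the full dense Jacobian never has to be stored.

```python
def jacobian_column(cell, observations):
    # One column of the Jacobian: the effect of a unit-density cell at
    # (x, z) on every observation point. Toy 2D kernel for illustration.
    x, z = cell
    return [z / ((xo - x) ** 2 + z ** 2) for xo in observations]

def accrete(predicted, cell, density, observations):
    # Compute the column only now that the cell joins the estimate, scale
    # it by the density contrast, and let it be garbage-collected after.
    column = jacobian_column(cell, observations)
    return [p + density * c for p, c in zip(predicted, column)]
```

With M cells and N observations, the dense Jacobian needs O(M·N) memory, while this scheme keeps only the O(N) predicted-data vector plus one column at a time.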
]]>This poster and expanded abstract present the first version of what would become my first publication, "Robust 3D gravity gradient inversion by planting anomalous densities", and eventually my Masters dissertation. The inversion method was developed over a few iterations after this (in order):
I have added an open-source implementation of the method to the Python library Fatiando a Terra. In versions 0.1 through 0.4, the code is in the module fatiando.gravmag.harvester.
We present a new gravity gradient tensor inversion for estimating a 3D density-contrast distribution defined on a user-specified grid of prisms. Our method consists of an iterative algorithm that does not require the solution of a large equation system. Instead, the solution grows systematically around user-specified prismatic elements called “seeds”. Each seed can have a different density contrast, allowing the interpretation of multiple bodies with different density contrasts. The compactness of the solution is imposed by means of a regularizing function that favors compact bodies closest to the previously specified seeds. The solution grows by accreting prisms neighboring the current solution. The prisms for the accretion are chosen by systematically searching the set of current neighboring prisms. Therefore, this approach allows the columns of the Jacobian matrix to be calculated on demand. This is a technique known in computer science as “lazy evaluation”, which greatly reduces the demand for computer memory and processing time. Tests on synthetic data from multiple buried sources at different depths and on real data collected over iron deposits located in the Quadrilátero Ferrífero, in the southeastern region of Brazil, confirmed the ability of our method to detect sharp and compact bodies.
Uieda, L., and V. C. F. Barbosa (2011), 3D gravity gradient inversion by planting density anomalies, 73rd EAGE Conference & Exhibition incorporating SPE EUROPEC, pp. 1-5
]]>This poster and conference proceedings present the results and methods after the 1.0 release of my open-source software Tesseroids: gravity forward modeling in spherical coordinates. Version 1.0 was a complete rewrite of the original Python code in the C language. This work was made possible by Professor Carla Braitenberg, who funded me to spend a month at the University of Trieste, Italy, rewriting the software from scratch. What followed was a much faster and more robust program. This version also featured the first iteration of the adaptive discretization presented in the paper Tesseroids: forward modeling gravitational fields in spherical coordinates.
Other presentations about Tesseroids:
The new observations of GOCE present a challenge to develop new calculation methods that take into account the sphericity of the Earth. We address this problem by using a discretization with a series of tesseroids. There is no closed formula giving the gravitational fields of a tesseroid, so numerical integration methods must be used, such as the Gauss-Legendre Cubature (GLC). A problem that arises is that computation times with tesseroids are high. Therefore, it is important to optimize the computations while maintaining the desired accuracy. This optimization was done using an adaptive computation scheme that consists of using a fixed GLC order and recursively subdividing the tesseroids. We have obtained an optimum ratio between the size of the tesseroid and its distance from the computation point. Furthermore, we show that this size-to-distance ratio is different for the gravitational attraction than for the gravity gradient tensor.
Uieda, L., E. P. Bomfim, C. Braitenberg, and E. Molina (2011), Optimal forward calculation method of the Marussi tensor due to a geologic structure at GOCE height, Proc. of 4th International GOCE User Workshop, pp. 1–5
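The recursive subdivision driven by a size-to-distance ratio can be sketched in one dimension. The real scheme splits tesseroids in three dimensions, and the ratio value used in the test below is arbitrary, not the optimum reported in the abstract.

```python
def adaptive_split(interval, point, ratio, max_depth=16):
    # Accept an element once its distance to the computation point is at
    # least `ratio` times its size; otherwise halve it and recurse. The
    # depth guard keeps the recursion finite near the element boundary.
    a, b = interval
    size = b - a
    distance = min(abs(point - a), abs(point - b))  # point assumed outside
    if max_depth == 0 or distance >= ratio * size:
        return [interval]
    mid = 0.5 * (a + b)
    return (adaptive_split((a, mid), point, ratio, max_depth - 1)
            + adaptive_split((mid, b), point, ratio, max_depth - 1))
```

Elements close to the computation point end up small while distant ones stay large, which is what keeps the cost down without losing accuracy near the observation.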
]]>This is a presentation of the methods behind the open-source software Tesseroids: gravity forward modeling in spherical coordinates. The algorithms implemented in the software have since been updated (see the paper Tesseroids: forward modeling gravitational fields in spherical coordinates) and have become a part of my PhD thesis. The content of this presentation is a summary of my Bachelor's degree thesis.
Other presentations about Tesseroids:
AGU abstract ID: G22A-04
The GOCE satellite mission has the objective of measuring the Earth's gravitational field with an unprecedented accuracy through the measurement of the gravity gradient tensor (GGT). One of the several applications of this new gravity data set is to study the geodynamics of the lithospheric plates, where the flat Earth approximation may not be ideal and the Earth's curvature should be taken into account. In such a case, the Earth could be modeled using tesseroids, also called spherical prisms, instead of the conventional rectangular prisms. The GGT due to a tesseroid is calculated using numerical integration methods, such as the Gauss-Legendre Quadrature (GLQ), as already proposed by Asgharzadeh et al. (2007) and Wild-Pfeiffer (2008). We present a computer program for the direct computation of the GGT caused by a tesseroid using the GLQ. The accuracy of this implementation was evaluated by comparing its results with the result of analytical formulas for the special case of a spherical cap with the computation point located at one of the poles. The GGT due to the topographic masses of the Paraná basin (SE Brazil) was estimated at 260 km altitude in an attempt to quantify this effect on the GOCE gravity data. The digital elevation model ETOPO1 (Amante and Eakins, 2009) between 40° W and 65° W and between 10° S and 35° S, which includes the Paraná basin, was used to generate a tesseroid model of the topography with grid spacing of 10' x 10' and a constant density of 2670 kg/m³. The largest amplitude observed was on the second vertical derivative component (-0.05 to 1.20 Eötvös) in regions of rough topography, such as that along the eastern Brazilian continental margins. These results indicate that the GGT due to topographic masses may have amplitudes of the same order of magnitude as the GGT due to density anomalies within the crust and mantle.
References
Amante, C., Eakins, B.W., 2009. ETOPO1 1 Arc-Minute Global Relief Model: Procedures, Data Sources and Analysis. NOAA Technical Memorandum NESDIS NGDC-24, p. 19.
Asgharzadeh, M.F.; Von Frese, R.R.B.; Kim, H.R.; Leftwich, T.E.; Kim, J.W., 2007. Spherical prism gravity effects by Gauss-Legendre quadrature integration. Geophysical Journal International, v. 169, p. 1 - 11.
Wild-Pfeiffer, F., 2008. A comparison of different mass elements for use in gravity gradiometry. Journal of Geodesy, v. 82 (10), p. 637 - 653.
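As a concrete illustration of Gauss-Legendre quadrature applied to a gravity integral, here is a self-contained sketch. It integrates the vertical attraction of a uniform rectangular volume with a hardcoded 3-point rule per axis; the work above applies the same idea to tesseroids in spherical coordinates, so the Cartesian geometry, the fixed order, and the numbers in the check below are simplifications of my own for illustration.

```python
import math

# 3-point Gauss-Legendre nodes and weights on [-1, 1]
NODES = (-math.sqrt(3.0 / 5.0), 0.0, math.sqrt(3.0 / 5.0))
WEIGHTS = (5.0 / 9.0, 8.0 / 9.0, 5.0 / 9.0)

G = 6.674e-11  # gravitational constant (m^3 kg^-1 s^-2)

def glq_gz(bounds, density, xp, yp, zp):
    # Vertical attraction g_z of a uniform rectangular volume at point
    # (xp, yp, zp), by a 3x3x3 tensor-product GLQ rule (z positive down).
    (x1, x2), (y1, y2), (z1, z2) = bounds
    sx, sy, sz = (x2 - x1) / 2, (y2 - y1) / 2, (z2 - z1) / 2  # scale factors
    total = 0.0
    for wi, ni in zip(WEIGHTS, NODES):
        x = sx * ni + (x1 + x2) / 2
        for wj, nj in zip(WEIGHTS, NODES):
            y = sy * nj + (y1 + y2) / 2
            for wk, nk in zip(WEIGHTS, NODES):
                z = sz * nk + (z1 + z2) / 2
                r = math.sqrt((x - xp) ** 2 + (y - yp) ** 2 + (z - zp) ** 2)
                # integrand of g_z: (z - zp) / r^3, weighted by the rule
                total += wi * wj * wk * (z - zp) / r ** 3
    return G * density * sx * sy * sz * total
```

For a source much smaller than its distance to the observation point, the result should approach the point-mass attraction G·m/r², which gives a quick sanity check on the quadrature.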
]]>This is the first poster I presented about my undergraduate research on forward modeling with tesseroids (spherical prisms). This would eventually become the software Tesseroids, the paper Tesseroids: forward modeling gravitational fields in spherical coordinates, and a part of my PhD thesis.
Other presentations about Tesseroids:
ESA (the European Space Agency) plans to launch the GOCE (Gravity field and steady-state Ocean Circulation Explorer) mission in the fall of 2008. The mission was designed to measure the Earth's gravitational field with unprecedented accuracy and resolution. To do so, it will use an electrostatic gravity gradiometer consisting of three mutually orthogonal pairs of identical accelerometers. GOCE will provide gravity gradient tensor (GGT) data at an orbital altitude of approximately 250 km.
A computer program is being developed to analyze GGT data over the Brazilian sedimentary basins. The program will use the Gauss-Legendre Quadrature method to perform forward modeling of the GGT produced by geological features or bodies with spherical geometry. The modeling will discretize the body into tesseroids, also called spherical prisms. Tesseroids are segments of a spherical shell of finite thickness, bounded by geographic grid lines. The geometry of tesseroids makes it possible to build models that take the curvature of the Earth into account. This becomes important when modeling geological bodies with large lateral extent, such as the Paraná basin. A density model of this basin will be built from well and seismic data, and we will use the program to obtain GGT estimates. The estimates will be compared with the future GOCE data in an attempt to separate the gravimetric component associated with density variations in the deeper part of the basin.
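Since tesseroids are bounded by meridians, parallels, and concentric spheres, their volume has a simple closed form even though their gravitational fields do not. A small sketch (the units and the full-shell check below are my own, not from the abstract):

```python
import math

def tesseroid_volume(lon1, lon2, lat1, lat2, r1, r2):
    # Volume of a tesseroid (spherical prism): the integral of
    # r^2 cos(lat) dr dlat dlon over the given bounds (degrees, meters).
    dlon = math.radians(lon2 - lon1)
    dsin = math.sin(math.radians(lat2)) - math.sin(math.radians(lat1))
    return dlon * dsin * (r2 ** 3 - r1 ** 3) / 3.0
```

A tesseroid spanning the whole sphere must reproduce the sphere's volume, which makes an easy correctness check.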
]]>This is the first poster I ever made. It was about my first undergraduate research project at the paleomagnetism lab at the Universidade de São Paulo. The project lasted for a year and I was able to go to the field and collect samples from Cambrian dikes.
I can't even find an abstract for this one, but I like to share it anyway. It's a good way to see my progress, and it's kind of nostalgic.
]]>