Preprints
Giving software its due through community-driven review and publication
2019. Barba, LA, Bazan, J, Brown, J, Guimera, RV, Gymrek, M, Hanna, A, Heagy, LJ, Huff, KD, Katz, DS, Madan, CR, Moerman, KM, Niemeyer, KE, Poulson, JL, Prins, P, Ram, K, Rokem, A, Smith, AM, Thiruvathukal, GK, Thyng, KM, Uieda, L, Wilson, BE, Yehudi, Y
Note: This correspondence was written by the editorial board of the Journal of Open Source Software in response to the Nature Methods editorial "Giving Software its Due". It was not accepted as a comment on the editorial so we published it as a preprint instead.
Abstract
A recent editorial in Nature Methods, "Giving Software its Due", described challenges related to the development of research software and highlighted, in particular, the challenge of software publication and citation. Here, we call attention to a system that we have developed that enables community-driven software review, publication, and citation: The Journal of Open Source Software (JOSS) is an open-source project and an open access journal that provides a light-weight publishing process for research software. Focused on and based in open platforms and on a community of contributors, JOSS evidently satisfies a pressing need, having already published more than 500 articles in approximately three years of existence.
Papers
Gradient-boosted equivalent sources
2021. Soler, SR, Uieda, L
Note: This research was done entirely with open-source software and open data! This means that anyone should be able to fully reproduce our results using the information in the paper and the material in the associated GitHub repository. This is the final part of Santiago's PhD thesis.
Abstract
The equivalent source technique is a powerful and widely used method for processing gravity and magnetic data. Nevertheless, its major drawback is the large computational cost in terms of processing time and computer memory. We present two techniques for reducing the computational cost of equivalent source processing: block-averaging source locations and the gradient-boosted equivalent source algorithm. Through block-averaging, we reduce the number of source coefficients that must be estimated while retaining the minimum desired resolution in the final processed data. With the gradient boosting method, we estimate the source coefficients in small batches along overlapping windows, allowing us to reduce the computer memory requirements arbitrarily to conform to the constraints of the available hardware. We show that the combination of block-averaging and gradient-boosted equivalent sources is capable of producing accurate interpolations through tests against synthetic data. Moreover, we demonstrate the feasibility of our method by gridding a gravity dataset covering Australia with over 1.7 million observations using a modest personal computer.
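For readers curious about the gradient-boosting idea, here is a schematic NumPy sketch of the windowed fitting described above: source coefficients are estimated in small batches over overlapping windows, and the data residuals are updated after each batch. This is only an illustration (point sources with a 1/distance Green's function and damped least squares per window), not the implementation used in the paper; all names are made up for the example.

import numpy as np

def greens_matrix(data_coords, source_coords):
    # Point-source-like Green's functions: 1 / distance between each
    # observation point (rows) and each equivalent source (columns).
    dx = data_coords[:, None, 0] - source_coords[None, :, 0]
    dy = data_coords[:, None, 1] - source_coords[None, :, 1]
    dz = data_coords[:, None, 2] - source_coords[None, :, 2]
    return 1.0 / np.sqrt(dx**2 + dy**2 + dz**2)

def fit_gradient_boosted(data_coords, data, source_coords, windows, damping=1e-3):
    # Estimate the source coefficients window by window, always fitting
    # the current residuals instead of the original data.
    coefficients = np.zeros(source_coords.shape[0])
    residuals = data.copy()
    for data_idx, source_idx in windows:  # overlapping windows of indices
        jacobian = greens_matrix(data_coords[data_idx], source_coords[source_idx])
        hessian = jacobian.T @ jacobian + damping * np.identity(len(source_idx))
        update = np.linalg.solve(hessian, jacobian.T @ residuals[data_idx])
        coefficients[source_idx] += update
        # Remove the predicted contribution of this batch from all residuals.
        residuals -= greens_matrix(data_coords, source_coords[source_idx]) @ update
    return coefficients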
Pooch: A friend to fetch your data files
2020. Uieda, L, Soler, SR, Rampin, R, van Kemenade, H, Turk, M, Shapero, D, Banihirwe, A, Leeman, J
Note: This paper marks the release of Pooch v0.7.1, a Python library for downloading and managing data files. Pooch is a part of the new ecosystem of packages in Fatiando a Terra. The peer-review at JOSS is open and can be found on GitHub issue openjournals/joss-reviews#1943.
Abstract
Scientific software is usually created to acquire, analyze, model, and visualize data. As such, many software libraries include sample datasets in their distributions for use in documentation, tests, benchmarks, and workshops. A common approach is to include smaller datasets in the GitHub repository directly and package them with the source and binary distributions (e.g., scikit-learn and scikit-image do this). As data files increase in size, it becomes unfeasible to store them in GitHub repositories. Thus, larger datasets require writing code to download the files from a remote server to the user's computer. The same problem is faced by scientists using version control to manage their research projects. While downloading a data file over HTTPS can be done easily with modern Python libraries, it is not trivial to manage a set of files, keep them updated, and check for corruption. For example, scikit-learn, Cartopy, and PyVista all include code dedicated to this particular task. Instead of scientists and library authors recreating the same code, it would be best to have a minimalistic and easy to set up tool for fetching and maintaining data files. Pooch is a Python library that fills this gap. It manages a data registry (containing file names, SHA-256 cryptographic hashes, and download URLs) by downloading files from one or more remote servers and storing them in a local data cache. Pooch is written in pure Python and has minimal dependencies. It can be easily installed from the Python Package Index (PyPI) and conda-forge on a wide range of Python versions: 2.7 (up to Pooch 0.6.0) and from 3.5 to 3.8.
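As a quick illustration of the registry-plus-fetch workflow described above, a project using Pooch typically sets up something like the sketch below (the project name, URL, file name, and hash are placeholders, not taken from the paper):

import pooch

REGISTRY = pooch.create(
    # Cache downloads in the user's default cache folder for this project.
    path=pooch.os_cache("myproject"),
    base_url="https://github.com/myorg/myproject/raw/v1.0.0/data/",
    # Known files and their SHA-256 hashes (placeholder hash shown here).
    registry={"some-data.csv": "sha256:0123abcd..."},
)

def fetch_some_data():
    # Downloads on first use, verifies the hash, and reuses the cached copy.
    return REGISTRY.fetch("some-data.csv")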
The Generic Mapping Tools Version 6
2019. Wessel, P, Luis, J, Uieda, L, Scharroo, R, Wobbe, F, Smith, WHF, Tian, D
Note: This paper marks the release of GMT version 6. Most of the work done for this release had the goal of reducing barriers to entry for new users. The user experience as a whole has been improved and these changes are the foundation for my work on PyGMT. The development of the new modern mode was funded by our NSF EarthScope grant.
Abstract
The Generic Mapping Tools (GMT) software is ubiquitous in the Earth and Ocean sciences. As a cross-platform tool producing high quality maps and figures, it is used by tens of thousands of scientists around the world. The basic syntax of GMT scripts has evolved very slowly since the 1990s, despite the fact that GMT is generally perceived to have a steep learning curve with many pitfalls for beginners and experienced users alike. Reducing these pitfalls means changing the interface, which would break compatibility with thousands of existing scripts. With the latest GMT version 6, we solve this conundrum by introducing a new "modern mode" to complement the interface used in previous versions, which GMT 6 now calls "classic mode". GMT 6 defaults to classic mode and thus is a recommended upgrade for all GMT 5 users. Nonetheless, new users should take advantage of modern mode to make shorter scripts, quickly access commonly used global data sets, and take full advantage of the new tools to draw subplots, place insets, and create animations.
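Since the note above mentions PyGMT, here is a minimal sketch of what a modern-mode figure looks like when driven from Python through PyGMT (the region, projection, and output file name are arbitrary choices for illustration, not from the paper):

import pygmt

# In modern mode the Figure object handles the session bookkeeping that
# classic-mode scripts had to manage with output redirection and -K/-O flags.
fig = pygmt.Figure()
fig.coast(
    region="AU",        # Australia, via GMT's built-in country codes
    projection="M15c",  # Mercator projection, 15 cm wide
    land="lightgray",
    water="skyblue",
    frame=True,
)
fig.savefig("australia.png")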
Gravitational field calculation in spherical coordinates using variable densities in depth
2019. Soler, SR, Pesce, A, Gimenez, ME, Uieda, L
Note: This paper builds upon my work on Tesseroids and extends the methodology to work for depth-variable densities. Santiago led this project, did most of the work and a large part of the writing of the paper. This is the first paper of his PhD thesis.
Abstract
We present a new methodology to compute the gravitational fields generated by tesseroids (spherical prisms) whose density varies with depth according to an arbitrary continuous function. It approximates the gravitational fields through the Gauss-Legendre Quadrature along with two discretization algorithms that automatically control its accuracy by adaptively dividing the tesseroid into smaller ones. The first one is a preexisting two-dimensional adaptive discretization algorithm that reduces the errors due to the distance between the tesseroid and the computation point. The second is a new density-based discretization algorithm that decreases the errors introduced by the variation of the density function with depth. The number of divisions made by each algorithm is indirectly controlled by two parameters: the distance-size ratio and the delta ratio. We have obtained analytical solutions for a spherical shell with radially variable density and compared them to the results of the numerical model for linear, exponential, and sinusoidal density functions. These comparisons allowed us to obtain optimal values for the distance-size and delta ratios that yield an accuracy of 0.1% of the analytical solutions. The resulting optimal values of distance-size ratio for the gravitational potential and its gradient are 1 and 2.5, respectively. The density-based discretization algorithm produces no discretizations in the linear density case, but a delta ratio of 0.1 is needed for the exponential and the sinusoidal density functions. These values can be extrapolated to cover most common use cases. However, the distance-size and delta ratios can be configured by the user to increase the accuracy of the results at the expense of computational speed. Lastly, we apply this new methodology to model the Neuquén Basin, a foreland basin in Argentina with a maximum depth of over 5000 m, using an exponential density function.
Efficient 3D large-scale forward-modeling and inversion of gravitational fields in spherical coordinates with application to lunar mascons
2019. Zhao, G, Chen, B, Uieda, L, Liu, J, Kaban, MK, Chen, L, Guo, R
Note: This is a new collaboration that came about in an unexpected way. Leo reviewed an initial version of this paper and it ended up being rejected by the journal. Afterwards, the authors reached out and kindly asked if he wanted to help improve the paper further. We worked on this for the better part of a year, adding the inversion and lunar mascon application.
Abstract
An efficient forward modeling algorithm for calculation of gravitational fields in spherical coordinates is developed for 3D large-scale gravity inversion problems. 3D Gauss-Legendre quadrature (GLQ) is used to calculate the gravitational fields of mass distributions discretized into tesseroids. Equivalence relations in the kernel matrix of the forward modeling are exploited to decrease storage and computation time. The numerical tests demonstrate that the computation time of the proposed algorithm is reduced by approximately two orders of magnitude, and the memory requirement is reduced by a factor of N'λ compared with the traditional GLQ method, where N'λ is the number of model elements in the longitudinal direction. These significant improvements in computational efficiency and storage make it possible to calculate and store the dense Jacobian matrix in 3D large-scale gravity inversions. The equivalence relations can be applied to the Taylor series method or combined with the adaptive discretization to ensure high accuracy. To further illustrate the capability of the algorithm, we present a regional synthetic example. The inverted results show density distributions consistent with the actual model. The computation took about 6.3 hours and 0.88 GB of memory compared with about a dozen days and 245.86 GB for the traditional 3D GLQ method. Finally, the proposed algorithm is applied to the gravity field derived from the latest lunar gravity model GL1500E. 3D density distributions of the Imbrium and Serenitatis basins are obtained, and high-density bodies are found at depths of 10-60 km, likely indicating a significant uplift of the high-density mantle beneath the two mascon basins.
Verde: Processing and gridding spatial data using Green's functions
2018. Uieda, L
Note: This paper marks the release of Verde v1.0.0, a Python library for processing and gridding spatial data. Verde is a part of the new ecosystem of packages in Fatiando a Terra. The peer-review at JOSS is open and can be found on GitHub issue openjournals/joss-reviews#957.
Abstract
Verde is a Python library for gridding spatial data using different Green's functions. It differs from the radial basis functions in scipy.interpolate by providing an API inspired by scikit-learn. The Verde API should be familiar to scikit-learn users but is tweaked to work with spatial data, which has Cartesian or geographic coordinates and multiple data components instead of an X feature matrix and y label vector. The library also includes more specialized Green's functions, utilities for trend estimation and data decimation (which are often required prior to gridding), and more. Some of these interpolation and data processing methods already exist in the Generic Mapping Tools (GMT), a command-line program popular in the Earth Sciences. However, there are no model selection tools in GMT and it can be difficult to separate parts of the processing that are done internally by its modules. Verde is designed to be modular, easily extended, and integrated into the scientific Python ecosystem. It can be used to implement new interpolation methods by subclassing the verde.base.BaseGridder class, requiring only the implementation of the new Green's function. For example, it is currently being used to develop a method for interpolation of 3-component GPS data.
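A minimal sketch of the fit-then-grid workflow described above, assuming the Verde 1.x API (the synthetic data here are only for illustration):

import numpy as np
import verde as vd

# Synthetic scattered observations of a smooth scalar field.
region = (0, 10, -5, 5)
coordinates = vd.scatter_points(region, size=500, random_state=0)
data = np.sin(coordinates[0]) * np.cos(coordinates[1])

# The scikit-learn inspired estimator interface: fit, then predict or grid.
spline = vd.Spline()
spline.fit(coordinates, data)
grid = spline.grid(spacing=0.5)  # returns an xarray.Dataset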
Fast non-linear gravity inversion in spherical coordinates with application to the South American Moho
2017. Uieda, L, Barbosa, VCF
Note: This paper is one of the chapters of my PhD thesis. It describes a new gravity inversion method to estimate the depth of the crust-mantle interface (the Moho). The inversion builds upon my work on tesseroid modelling.
Abstract
Estimating the relief of the Moho from gravity data is a computationally intensive non-linear inverse problem. What is more, the modeling must take the Earth's curvature into account when the study area is of regional scale or greater. We present a regularized non-linear gravity inversion method that has a low computational footprint and employs a spherical Earth approximation. To achieve this, we combine the highly efficient Bott's method with smoothness regularization and a discretization of the anomalous Moho into tesseroids (spherical prisms). The computational efficiency of our method is attained by harnessing the fact that all matrices involved are sparse. The inversion results are controlled by three hyper-parameters: the regularization parameter, the anomalous Moho density-contrast, and the reference Moho depth. We estimate the regularization parameter using the method of hold-out cross-validation. Additionally, we estimate the density-contrast and the reference depth using knowledge of the Moho depth at certain points. We apply the proposed method to estimate the Moho depth for the South American continent using satellite gravity data and seismological data. The final Moho model is in accordance with previous gravity-derived models and seismological data. The misfit to the gravity and seismological data is worse in the Andes and best in oceanic areas, central Brazil and Patagonia, and along the Atlantic coast. Similarly to previous results, the model suggests a thinner crust of 30-35 km under the Andean foreland basins. Discrepancies with the seismological data are greatest in the Guyana Shield, the central Solimões and Amazonas Basins, the Paraná Basin, and the Borborema province. These differences suggest the existence of crustal or mantle density anomalies that were unaccounted for during gravity data processing.
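To give a flavor of the core of the method, here is a deliberately simplified, planar sketch of Bott's iteration (the part our method builds on): the depth of the anomalous Moho under each observation point is updated in proportion to the gravity residual using the Bouguer slab approximation. The smoothness regularization, tesseroid modeling, and cross-validation of the actual method are omitted, and sign conventions depend on how the anomalous Moho is defined.

import numpy as np

G = 6.674e-11  # gravitational constant (m^3 kg^-1 s^-2)

def bott_iterations(depths, observed, forward, density_contrast, n_iter=10):
    # depths: current anomalous Moho relief below each observation point (m)
    # observed: observed gravity disturbance at each point (m/s^2)
    # forward: function predicting gravity from the current depths
    # density_contrast: anomalous Moho density contrast (kg/m^3)
    for _ in range(n_iter):
        residuals = observed - forward(depths)
        # Bouguer slab relation: delta_g = 2 * pi * G * rho * delta_z
        depths = depths + residuals / (2 * np.pi * G * density_contrast)
    return depths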
Tesseroids: forward modeling gravitational fields in spherical coordinates
2016. Uieda, L, Barbosa, VCF, Braitenberg, C
Note: This paper describes the algorithms used in version 1.2.0 of the open-source software Tesseroids. It's also one of the chapters of my PhD thesis.
Abstract
We present the open-source software Tesseroids, a set of command-line programs to perform the forward modeling of gravitational fields in spherical coordinates. The software is implemented in the C programming language and uses tesseroids (spherical prisms) for the discretization of the subsurface mass distribution. The gravitational fields of tesseroids are calculated numerically using the Gauss-Legendre Quadrature (GLQ). We have improved upon an adaptive discretization algorithm to guarantee the accuracy of the GLQ integration. Our implementation of adaptive discretization uses a "stack"-based algorithm instead of recursion to achieve more control over execution errors and corner cases. The algorithm is controlled by a scalar value called the distance-size ratio (D) that determines the accuracy of the integration as well as the computation time. We determined optimal values of D for the gravitational potential, gravitational acceleration, and gravity gradient tensor by comparing the computed tesseroid effects with those of a homogeneous spherical shell. The values required for a maximum relative error of 0.1% of the shell effects are D = 1 for the gravitational potential, D = 1.5 for the gravitational acceleration, and D = 8 for the gravity gradients. Contrary to previous assumptions, our results show that the potential and its first and second derivatives require different values of D to achieve the same accuracy. These values were incorporated as defaults in the software.
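The stack-based adaptive discretization mentioned above can be sketched in a few lines of Python. The helper functions are passed in as placeholders (they stand for the GLQ integration, tesseroid splitting, and size/distance calculations of the actual C implementation), so this is only an outline of the control flow, not the Tesseroids code itself.

def adaptive_effect(tesseroid, point, D, glq_effect, size_of, distance_to, split):
    # Sum the GLQ effect of a tesseroid on a computation point, splitting
    # the tesseroid whenever it is too close for the required accuracy.
    result = 0.0
    stack = [tesseroid]
    while stack:
        cell = stack.pop()
        if distance_to(cell, point) / size_of(cell) < D:
            # Distance-size ratio not satisfied: divide and try again.
            stack.extend(split(cell))
        else:
            result += glq_effect(cell, point)
    return result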
How two gravity-gradient inversion methods can be used to reveal different geologic features of ore deposit — A case study from the Quadrilátero Ferrífero (Brazil)
2016. Carlos, DU, Uieda, L, Barbosa, VCF
doi:10.1016/j.jappgeo.2016.04.011
Abstract
Airborne gravity gradiometry data have been recently used in mining surveys to map the 3D geometry of ore deposits. This task can be achieved by different gravity-gradient inversion methods, many of which use a voxel-based discretization of the Earth's subsurface. To produce a unique and stable solution, an inversion method introduces particular constraints. One constraining inversion introduces a depth-weighting function in the first-order Tikhonov regularization imposing a smoothing on the density-contrast distributions that are not restricted to near-surface regions. Another gravity-gradient inversion, the method of planting anomalous densities, imposes compactness and sharp boundaries on the density-contrast distributions. We used these two inversion methods to invert the airborne gravity-gradient data over the iron-ore deposit at the southern flank of the Gandarela syncline in Quadrilátero Ferrífero (Brazil). Because these methods differ from each other in the particular constraint used, the estimated 3D density-contrast distributions reveal different geologic features of ore deposit. The depth-weighting smoothing inversion reveals variable dip directions along the strike of the retrieved iron-ore body. The planting anomalous density inversion estimates a compact iron-ore mass with a single density contrast, which reveals a variable volume of the iron ore along its strike increasing towards the hinge zone of the Gandarela syncline which is the zone of maximum compression. The combination of the geologic features inferred from each estimate leads to a synergistic effect, revealing that the iron-ore deposit is strongly controlled by the Gandarela syncline.
Estimation of the total magnetization direction of approximately spherical bodies
2015. Oliveira Jr, VC, Sales, DP, Barbosa, VCF, Uieda, L
Note: This paper has undergone open peer-review. The original submission, reviews, and replies can be viewed at the journal website.
Abstract
We have developed a fast total-field anomaly inversion to estimate the magnetization direction of multiple sources with approximately spherical shapes and known centres. Our method is an overdetermined inverse problem that can be applied to interpret multiple sources with different but homogeneous magnetization directions. It requires neither the prior computation of any transformation-like reduction to the pole nor the use of regularly spaced data on a horizontal grid. The method contains flexibility to be implemented as a linear or non-linear inverse problem, which results, respectively, in a least-squares or robust estimate of the components of the magnetization vector of the sources. Applications to synthetic data show the robustness of our method against interfering anomalies and errors in the location of the sources' centre. Besides, we show the feasibility of applying the upward continuation to interpret non-spherical sources. Applications to field data over the Goiás alkaline province (GAP), Brazil, show the good performance of our method in estimating geologically meaningful magnetization directions. The results obtained for a region of the GAP, near the alkaline complex of Diorama, suggest the presence of non-outcropping sources marked by strong remanent magnetization with inclination and declination close to −70.35° and −19.81°, respectively. This estimated magnetization direction leads to predominantly positive reduced-to-the-pole anomalies, even for another region of the GAP, in the alkaline complex of Montes Claros de Goiás. These results show that the non-outcropping sources near the alkaline complex of Diorama have almost the same magnetization direction as those in the alkaline complex of Montes Claros de Goiás, strongly suggesting that these sources have been emplaced in the crust within almost the same geological time interval.
Imaging iron ore from the Quadrilátero Ferrífero (Brazil) using geophysical inversion and drill hole data
2014. Carlos, DU, Uieda, L, Barbosa, VCF
doi:10.1016/j.oregeorev.2014.02.011
Abstract
The Quadrilátero Ferrífero in southeastern Brazil hosts one of the largest concentrations of lateritic iron ore deposits in the world. Our study area is over the southern flank of the Gandarela syncline which is one of the regional synclines of the Quadrilátero Ferrífero. The Gandarela syncline is considered the Brazilian megastructure with the highest perspectives for iron ore exploration. Most of the iron-ore deposits from the Quadrilátero Ferrífero are non-outcropping bodies hosted in the oxidized, metamorphosed and heterogeneously deformed banded iron formations. Therefore, the assessment of the 3D geometry of the iron-ore body is of the utmost importance for estimating reserves and production development planning. We carried out a quantitative interpretation of the iron-ore deposit of the southern flank of the Gandarela syncline using a 3D inversion of airborne gravity-gradient data to estimate the shape of the iron-ore mineralization. The retrieved body is characterized by a high-density zone associated with the northeast-elongated iron formation. The thickness and the width of the retrieved iron-ore body vary along its strike increasing southwestward. The presence of a large volume of iron ore in the southwest portion of the study area may be due to the hinge zone of the Gandarela syncline, which is the zone of maximum compression. Our estimated iron-ore mass reveals variable dip directions. In the southernmost, central and northernmost portions of the study area, the estimated iron body dips, respectively, inwards, vertically and outwards with respect to the syncline axis. Previous geological mapping indicated continuous mineralization. However, our result suggests a quasi-continuous iron-ore body. In the central part of the study area, the estimated iron-ore body is segmented into two parts. This breakup may be related to the northwest-trending faults, which are perpendicular to the northeast-trending axis of the Gandarela syncline. Our estimated iron-ore mass agrees reasonably well with the information provided from the lithologic logging data of drill holes. In this geophysical study, the estimated iron-ore reserves are approximately 3 billion tons.
Estimating the nature and the horizontal and vertical positions of 3D magnetic sources using Euler deconvolution
2013. Melo, FF, Barbosa, VCF, Uieda, L, Oliveira Jr, VC, Silva, JBC
Abstract
We have developed a new method that drastically reduces the number of the source location estimates in Euler deconvolution to only one per anomaly. Our method employs the analytical estimators of the base level and of the horizontal and vertical source positions in Euler deconvolution as a function of the x- and y-coordinates of the observations. By assuming any tentative structural index (defining the geometry of the sources), our method automatically locates plateaus, on the maps of the horizontal coordinate estimates, indicating consistent estimates that are very close to the true corresponding coordinates. These plateaus are located in the neighborhood of the highest values of the anomaly and show a contrasting behavior with those estimates that form inclined planes at the anomaly borders. The plateaus are automatically located on the maps of the horizontal coordinate estimates by fitting a first-degree polynomial to these estimates in a moving-window scheme spanning all estimates. The positions where the angular coefficient estimates are closest to zero identify the plateaus of the horizontal coordinate estimates. The sample means of these horizontal coordinate estimates are the best horizontal location estimates. After mapping each plateau, our method takes as the best structural index the one that yields the minimum correlation between the total-field anomaly and the estimated base level over each plateau. By using the estimated structural index for each plateau, our approach extracts the vertical coordinate estimates over the corresponding plateau. The sample means of these estimates are the best depth location estimates in our method. When applied to synthetic data, our method yielded good results if the bodies produce weak- and mid-interfering anomalies. A test on real data over intrusions in the Goiás Alkaline Province, Brazil, retrieved sphere-like sources suggesting 3D bodies.
Polynomial equivalent layer
2013. Oliveira Jr, VC, Barbosa, VCF, Uieda, L
Abstract
We have developed a new cost-effective method for processing large-potential-field data sets via the equivalent-layer technique. In this approach, the equivalent layer is divided into a regular grid of equivalent-source windows. Inside each window, the physical-property distribution is described by a bivariate polynomial. Hence, the physical-property distribution within the equivalent layer is assumed to be a piecewise polynomial function defined on a set of equivalent-source windows. We perform any linear transformation of a large set of data as follows. First, we estimate the polynomial coefficients of all equivalent-source windows by using a linear regularized inversion. Second, we transform the estimated polynomial coefficients of all windows into the physical-property distribution within the whole equivalent layer. Finally, we premultiply this distribution by the matrix of Green's functions associated with the desired transformation to obtain the transformed data. The regularized inversion deals with a linear system of equations with dimensions based on the total number of polynomial coefficients within all equivalent-source windows. This contrasts with the classical approach of directly estimating the physical-property distribution within the equivalent layer, which leads to a system based on the number of data. Because the number of data is much larger than the number of polynomial coefficients, the proposed polynomial representation of the physical-property distribution within an equivalent layer drastically reduces the number of parameters to be estimated. By comparing the total number of floating-point operations required to estimate an equivalent layer via our method with the classical approach, both formulated with Cholesky's decomposition, we can verify that the computation time required for building the linear system and for solving the linear inverse problem can be reduced by as many as three and four orders of magnitude, respectively. Applications to synthetic and real data show that our method performs the standard linear transformations of potential-field data accurately.
Robust 3D gravity gradient inversion by planting anomalous densities
2012. Uieda, L, Barbosa, VCF
Note: This was my first publication in a scientific journal and the topic of my Master's dissertation. The inversion method proposed in this paper is implemented in the open-source Fatiando a Terra Python library as the fatiando.gravmag.harvester module, introduced in version 0.1.
Abstract
We have developed a new gravity gradient inversion method for estimating a 3D density-contrast distribution defined on a grid of rectangular prisms. Our method consists of an iterative algorithm that does not require the solution of an equation system. Instead, the solution grows systematically around user-specified prismatic elements, called “seeds,” with given density contrasts. Each seed can be assigned a different density-contrast value, allowing the interpretation of multiple sources with different density contrasts and that produce interfering signals. In real world scenarios, some sources might not be targeted for the interpretation. Thus, we developed a robust procedure that neither requires the isolation of the signal of the targeted sources prior to the inversion nor requires substantial prior information about the nontargeted sources. In our iterative algorithm, the estimated sources grow by the accretion of prisms in the periphery of the current estimate. In addition, only the columns of the sensitivity matrix corresponding to the prisms in the periphery of the current estimate are needed for the computations. Therefore, the individual columns of the sensitivity matrix can be calculated on demand and deleted after an accretion takes place, greatly reducing the demand for computer memory and processing time. Tests on synthetic data show the ability of our method to correctly recover the geometry of the targeted sources, even when interfering signals produced by nontargeted sources are present. Inverting the data from an airborne gravity gradiometry survey flown over the iron ore province of Quadrilátero Ferrífero, southeastern Brazil, we estimated a compact iron ore body that is in agreement with geologic information and previous interpretations.
Conference proceedings
A Single Euler Solution Per Anomaly
2014. Melo, FF, Barbosa, VCF, Uieda, L, Oliveira Jr, VC, Silva, JBC
76th EAGE Conference and Exhibition
doi:10.3997/2214-4609.20140891
Abstract
We developed a method that drastically reduces the number of source location estimates in Euler deconvolution to only one per anomaly. We use the analytical estimators of the Euler solutions. Our approach consists of automatically detecting the regions of the anomaly that produce consistent estimates of the source horizontal coordinates. These regions form plateaus in the horizontal coordinate estimates using any structural index (defining the geometry of the sources). We identify these plateaus by fitting a first-degree polynomial to the horizontal coordinate estimates with a moving-window operator that spans these estimates. The places where the angular-coefficient estimates are closest to zero identify the plateaus of the horizontal coordinate estimates, where consistent estimates of the horizontal source positions are found. The sample means of these horizontal coordinate estimates are the best estimates. The best structural index is the one that yields the minimum correlation between the total-field anomaly and the estimated base level over each plateau. By using the estimated structural index for each plateau, we extract the depth estimates over the corresponding plateau. The sample means of these estimates are the best depth estimates. A test on real data over alkaline bodies in central Brazil retrieved sphere-like sources, suggesting 3D bodies.
Modeling the Earth with Fatiando a Terra
2013. Uieda, L, Oliveira Jr, VC, Barbosa, VCF
Proceedings of the 12th Python in Science Conference
doi:10.25080/Majora-8b375195-010
Abstract
Geophysics is the science of using physical observations of the Earth to infer its inner structure. Generally, this is done with a variety of numerical modeling techniques and inverse problems. The development of new algorithms usually involves copying and pasting of code, which leads to errors and poor code reuse. Fatiando a Terra is a Python library that aims to automate common tasks and unify the modeling pipeline inside of the Python language. This allows users to replace the traditional shell scripting with more versatile and powerful Python scripting. The library can also be used as an API for developing stand-alone programs. Algorithms implemented in Fatiando a Terra can be combined to build upon existing functionality. This flexibility facilitates prototyping of new algorithms and quickly building interactive teaching exercises. In the future, we plan to continuously implement sample problems to help teach geophysics as well as classic and state-of-the-art algorithms.
Use of the "shape-of-anomaly" data misfit in 3D inversion by planting anomalous densities
2012. Uieda, L, Barbosa, VCF
SEG Technical Program Expanded Abstracts
Abstract
We present an improvement to the method of 3D gravity gradient inversion by planting anomalous densities. This method estimates a density-contrast distribution defined on a grid of right-rectangular prisms. Instead of solving large equation systems, the method uses a systematic search algorithm to grow the solution, one prism at a time, around user-specified prisms called “seeds”. These seeds have known density contrasts and the solution is constrained to be concentrated around the seeds as well as have their density contrasts. Thus, prior geologic and geophysical information is incorporated into the inverse problem through the seeds. However, this leads to a strong dependence of the solution on the correct location, density contrast, and number of seeds used. Our improvement to this method consists of using the "shape-of-anomaly" data-misfit function in conjunction with the l2-norm data-misfit function. The shape-of-anomaly function measures the difference in shape between the observed and predicted data and is insensitive to differences in amplitude. Tests on synthetic and real data show that the improved method not only has an increased robustness with respect to the number of seeds and their locations, but also provides a better fit of the observed data.
Iron ore interpretation using gravity-gradient inversions in the Carajás, Brazil
2012. Carlos, DU, Uieda, L, Li, Y, Barbosa, VCF, Braga, MA, Angeli, G, Peres, G
SEG Technical Program Expanded Abstracts
Abstract
We have interpreted the airborne gravity gradiometry data from the Carajás Mineral Province (CMP), Brazil, by using two different 3D inversion methods. Both inversion methods parameterize the Earth's subsurface into prismatic cells and estimate the 3D density-contrast distribution that retrieves an image of geologic sources subject to an acceptable data misfit. The first inversion method imposes smoothness on the solution by solving a linear system that minimizes a depth-weighted L2 model objective function of the density-contrast distribution. The second imposes compactness on the solution by using an iterative growth algorithm solved by a systematic search algorithm that accretes mass around user-specified prisms called “seeds”. Using these two inversion methods, the interpretation of full tensor gravity gradiometry data from an iron ore deposit in the area named N1 at CMP shows consistent geometry and depth for the iron orebody. To date, the maximum depth of the iron orebody is assumed to be 200 m based on the maximum depth attained by the deepest well drilled in this study area. However, both inversion results exhibit a source whose maximum bottom depth is greater than 200 m. These results give rise to two interpretations: i) the bottom of the iron orebody may be deeper than the maximum depth of 200 m attained by the deepest borehole; or ii) the iron orebody may be 200 m deep and the rocks below may be jaspilite, whose density is close to that of soft hematite.
Optimal forward calculation method of the Marussi tensor due to a geologic structure at GOCE height
2011. Uieda, L, Bomfim, EP, Braitenberg, C, Molina, E
Proceedings of the 4th International GOCE User Workshop
Abstract
The new observations of GOCE present a challenge to develop new calculation methods that take into account the sphericity of the Earth. We address this problem by using a discretization with a series of tesseroids. There is no closed formula giving the gravitational fields of the tesseroid and numerical integration methods must be used, such as the Gauss Legendre Cubature (GLC). A problem that arises is that the computation times with the tesseroids are high. Therefore, it is important to optimize the computations while maintaining the desired accuracy. This optimization was done using an adaptive computation scheme that consists of using a fixed GLC order and recursively subdividing the tesseroids. We have obtained an optimum ratio between the size of the tesseroid and its distance from the computation point. Furthermore, we show that this size-to-distance ratio is different for the gravitational attraction than for the gravity gradient tensor.
3D gravity inversion by planting anomalous densities
2011. Uieda, L, Barbosa, VCF
12th International Congress of the Brazilian Geophysical Society
Abstract
This paper presents a novel gravity inversion method for estimating a 3D density-contrast distribution defined on a grid of prisms. Our method consists of an iterative algorithm that does not require the solution of a large equation system. Instead, the solution grows systematically around user-specified prismatic elements called “seeds”. Each seed can have a different density contrast, allowing the interpretation of multiple bodies with different density contrasts and interfering gravitational effects. The compactness of the solution around the seeds is imposed by means of a regularizing function. The solution grows by the accretion of neighboring prisms of the current solution. The prisms for the accretion are chosen by systematically searching the set of current neighboring prisms. Therefore, this approach allows the columns of the Jacobian matrix to be calculated on demand. This is a known technique from computer science called “lazy evaluation”, which greatly reduces the demand for computer memory and processing time. Tests on synthetic data and on real data collected over the ultramafic Cana Brava complex, central Brazil, confirmed the ability of our method to detect sharp and compact bodies.
Robust 3D gravity gradient inversion by planting anomalous densities
2011. Uieda, L, Barbosa, VCF
SEG Technical Program Expanded Abstracts
Abstract
We present a new gravity gradient inversion method for estimating a 3D density-contrast distribution defined on a grid of prisms. Our method consists of an iterative algorithm that does not require the solution of a large equation system. Instead, the solution grows systematically around user-specified prismatic elements called “seeds”. Each seed can be assigned a different density contrast, allowing the interpretation of multiple bodies with different density contrasts that produce interfering gravitational effects. The compactness of the solution around the seeds is imposed by means of a regularizing function. The solution grows by the accretion of neighboring prisms of the current solution. The prisms for the accretion are chosen by systematically searching the set of current neighboring prisms. Therefore, this approach allows the columns of the Jacobian matrix to be calculated on demand, which greatly reduces the demand for computer memory and processing time. Tests on synthetic data and on real data collected over the iron ore province of Quadrilátero Ferrífero, southeastern Brazil, confirmed the ability of our method to detect sharp and compact bodies.
3D gravity gradient inversion by planting density anomalies
2011. Uieda, L, Barbosa, VCF
73rd EAGE Conference and Exhibition incorporating SPE EUROPEC
doi:10.3997/2214-4609.20149567
Abstract
This paper presents a novel gravity inversion method for estimating a 3D density-contrast distribution defined on a grid of prisms. Our method consists of an iterative algorithm that does not require the solution of a large equation system. Instead, the solution grows systematically around user-specified prismatic elements called “seeds”. Each seed can have a different density contrast, allowing the interpretation of multiple bodies with different density contrasts and interfering gravitational effects. The compactness of the solution around the seeds is imposed by means of a regularizing function. The solution grows by the accretion of neighboring prisms of the current solution. The prisms for the accretion are chosen by systematically searching the set of current neighboring prisms. Therefore, this approach allows the columns of the Jacobian matrix to be calculated on demand. This is a known technique from computer science called “lazy evaluation”, which greatly reduces the demand for computer memory and processing time. Tests on synthetic data and on real data collected over the ultramafic Cana Brava complex, central Brazil, confirmed the ability of our method to detect sharp and compact bodies.
In-depth imaging of an iron orebody from Quadrilatero Ferrifero using 3D gravity gradient inversion
2011. Carlos, DU, Uieda, L, Barbosa, VCF, Braga, MA, Gomes, AAS
SEG Technical Program Expanded Abstracts
Abstract
We have interpreted the airborne gravity gradiometry data from Quadrilátero Ferrífero, an iron ore province in southeastern Brazil. Aiming at retrieving the geometry of the iron body, we have used a fast and novel gravity inversion method for estimating a 3D density-contrast distribution defined on a grid of prisms. This inversion approach combines robust data-fitting with an iterative procedure that does not require the solution of a large equation system. By using a systematic search algorithm, the estimated mass grows around prismatic elements called “seeds”. The interpreter specifies the locations and the associated density contrasts of the seeds. Automatically, the inversion method fits the observations and favors compact gravity sources closest to the seeds. To produce a more robust data fit than a least-squares fit, the inversion method minimizes the L1-norm of the residuals. Hence, it allows the presence of large residuals, so that outliers produced by non-targeted bodies can be handled. By using 126 seeds, which were assigned density contrasts of 0.5 g.cm−3 and whose locations were based on our knowledge about the QF area, we have retrieved a continuous elongated iron body that fits the observed components of the gravity gradient. Our inversion result agrees reasonably well with previous geophysical interpretations. In addition, our result honors the borehole information about the iron body depth.
Inversão de Dados de Aerogradiometria Gravimétrica 3D-FTG Aplicada a Exploração Mineral na Região do Quadrilátero Ferrífero [Inversion of 3D-FTG airborne gravity gradiometry data applied to mineral exploration in the Quadrilátero Ferrífero region]
2011. Carlos, DU, Barbosa, VCF, Uieda, L, Braga, MA
12th International Congress of the Brazilian Geophysical Society
Abstract
We have interpreted the airborne gravity gradiometry data from Quadrilátero Ferrífero, an iron ore province in southeastern Brazil. Aiming at retrieving the geometry of the iron body, we have used a fast and novel gravity inversion method for estimating a 3D density-contrast distribution defined on a grid of prisms. This inversion approach combines robust data-fitting with an iterative procedure that does not require the solution of a large equation system. By using a systematic search algorithm, the estimated mass grows around prismatic elements called “seeds”. The interpreter specifies the locations and the associated density contrasts of the seeds. Automatically, the inversion method fits the observations and favors compact gravity sources closest to the seeds. To produce a more robust data fit than a least-squares fit, the inversion method minimizes the L1-norm of the residuals. Hence, it allows the presence of large residuals, so that outliers produced by non-targeted bodies can be handled. By using 126 seeds, which were assigned density contrasts of 0.5 g.cm-3 and whose locations were based on our knowledge about the QF area, we have retrieved a continuous elongated iron body that fits the observed components of the gravity gradient. Our inversion result agrees reasonably well with previous geophysical interpretations. In addition, our result honors the borehole information about the iron body depth.
Other
Step-by-step NMO correction
2017. Uieda, L
Note: This is a part of The Leading Edge geophysics tutorials series. All tutorials are open-access and include open-source code examples. The text is also included on the SEG Wiki! The code and idea for this tutorial came from my Introduction to Geophysics courses at UERJ. I came across the problem of implementing NMO correction while preparing my lecture and practical exercises on this topic. This is a clear example of how learning happens both ways in a classroom.
Abstract
Open any textbook about seismic data processing and you will inevitably find a section about the normal moveout (NMO) correction. When applied to a common midpoint (CMP) section, the correction is supposed to turn the hyperbola associated with a reflection into a straight horizontal line. What most textbooks won't tell you is how, exactly, to apply this equation to the data. Read on and I'll explain step-by-step how the algorithm for NMO correction from Yilmaz (2001) works and how to implement it in Python. The accompanying Jupyter notebook contains the full source code, with documentation and tests for each function.
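The core of the algorithm can be condensed into a short NumPy sketch: for each zero-offset time t0 and offset x, the reflection time is t = sqrt(t0^2 + x^2/v^2) and the amplitude found at t is moved to t0. The tutorial's actual code (with proper interpolation, documentation, and tests) is in the accompanying notebook; the version below uses simple nearest-sample lookup and made-up variable names.

import numpy as np

def nmo_correction(cmp_gather, dt, offsets, velocities):
    # cmp_gather: 2D array (n_samples, n_traces) of amplitudes
    # dt: time sampling interval (s)
    # offsets: source-receiver offset of each trace (m)
    # velocities: NMO velocity for each time sample (m/s)
    nsamples, ntraces = cmp_gather.shape
    corrected = np.zeros_like(cmp_gather)
    t0 = np.arange(nsamples) * dt
    for j, x in enumerate(offsets):
        # Reflection traveltime for every t0 at this offset.
        t = np.sqrt(t0**2 + (x / velocities)**2)
        samples = np.round(t / dt).astype(int)
        valid = samples < nsamples
        corrected[valid, j] = cmp_gather[samples[valid], j]
    return corrected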
Geophysical tutorial: Euler deconvolution of potential-field data
2014. Uieda, L, Oliveira Jr, VC, Barbosa, VCF
Note: This article is part of the Geophysical Tutorials series in The Leading Edge, started by Matt Hall. All tutorials are open-access and include open-source code examples. The February 2016 tutorial by Matt provides an introduction to the series. The tutorial is also available at the SEG wiki where it can edited and improved.
Abstract
In this tutorial we'll talk about a widely used method of interpretation for potential-field data called Euler deconvolution. Our goal is to demonstrate its usefulness and, most importantly, call attention to some pitfalls encountered in the interpretation of the results. The code and synthetic data required to reproduce our results and figures can be found in the accompanying IPython notebooks. The notebooks also expand the analysis presented here. We encourage you to download the data and try it on your software of choice. For this tutorial we'll use the implementation in the open-source Python package Fatiando a Terra.
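For reference, the least-squares solution of Euler's homogeneity equation that sits at the heart of the method can be written in a few lines of NumPy. The field derivatives are assumed to be given (computed, for example, with FFT-based filters), and this is not the Fatiando a Terra implementation used in the tutorial, just the core linear system for a single window and a chosen structural index.

import numpy as np

def euler_deconvolution(x, y, z, field, deriv_x, deriv_y, deriv_z, structural_index):
    # Euler's homogeneity equation,
    #   (x - x0)*dT/dx + (y - y0)*dT/dy + (z - z0)*dT/dz = SI*(b - T),
    # rearranged into a linear system for the source position (x0, y0, z0)
    # and base level b, solved by least squares over one window of data.
    jacobian = np.column_stack(
        [deriv_x, deriv_y, deriv_z, structural_index * np.ones_like(field)]
    )
    data = x * deriv_x + y * deriv_y + z * deriv_z + structural_index * field
    x0, y0, z0, base_level = np.linalg.lstsq(jacobian, data, rcond=None)[0]
    return x0, y0, z0, base_level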