The evolution of the down wave. We hit the earth with some sort of impact, which results in a movement. Since the earth is elastic, it rebounds past its original state, creating an oscillation. As this three-dimensional shape expands, resistance tempers the sharpness of the wave front, and further oscillation lengthens the disturbance. As the wave passes through the typical layered subsurface, reflections are created at each velocity interface, each seeming to be independent. Because the down wave has become leggy, the algebraic addition of all these closely spaced events creates a jumbled and complex pattern. However, detuning this mess is another subject. Our down wave continues.

Time series mathematics has allowed us to describe these shapes in terms of frequency. This is accomplished by modeling processes called transforms. The time series is convolved with a series of frequencies to measure correlation coefficients at each step. The result is a power spectrum (a small numerical sketch of this correlation view appears after this passage). The great leap in mathematical logic was to assume these coefficients actually represent pure frequency content that can be used in filter design.

When the layman thinks of a 100 cps component from the resulting spectrum, the tendency is to visualize a discrete wavelet. This could be true of course, but the fact is that this particular frequency, along with its partners, might have been needed only to model the dampening of the averaged waveform, or to model a complex shape that is made up entirely of lower dominant energy. Obviously, if the spectrum peaked at 100 cps this would suggest real lobes at that frequency. In any case some doubt should exist.

The supposed beauty of the modeling is that it allows filter computation in the frequency domain, which is relatively simple. Knowing the data spectrum, one can design some desired spectrum and mathematically create a transforming filter to remove any unwanted energy. In the simplest case of a strong ring, the modeling will probably be effective, showing a sharp spectral peak at the ringing frequency. As soon as the down wave starts to dampen, other modeling frequencies have to be brought into play. When the down wave begins to get more complex, with lobes transitioning from higher to lower dominant frequencies, the modeling task becomes much tougher, with the vital shape information being spread thinly over a wide frequency range and thus susceptible to noise.

A defense of my claims that the ADAPS non-linear system can materially improve deep resolution – David Paige

I start with a quote from a recent spirited exchange: “I understand your arguments about detuning, but you suggest that you can get back to the original spike train using your method. It is fine to calculate the spikes as you do, but they are still limited by the seismic data frequency content as to the uncertainty in their position and size. The mathematics behind this statement is exactly the same principle as the Heisenberg Uncertainty principle, which deals with the uncertainty between position and momentum, or time and energy, both of which are Fourier transform pairs. My argument is that you can calculate and integrate the spikes, but what comes out is not better than if you had created a suitable broadband wavelet and created an integrated trace as in the Sparse Spike method.”

I feel certain my correspondent speaks for many seismic researchers. I envy his sublime faith in current seismic theory. In contrast, I am always questioning what I have done.
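To make the transform-as-correlation description above a little more concrete, here is a minimal numerical sketch, not any production code: it correlates a synthetic damped trace with trial sinusoids at each frequency and records the resulting power, which is simply the discrete Fourier spectrum written out longhand. The trace, sample interval, and frequency range are invented purely for illustration.

```python
import numpy as np

dt = 0.002                                   # sample interval in seconds (2 ms, assumed)
t = np.arange(0.0, 1.0, dt)                  # one second of data
# synthetic "trace": a damped 30 cps ring plus a little random noise
trace = np.exp(-3.0 * t) * np.sin(2.0 * np.pi * 30.0 * t)
trace += 0.05 * np.random.default_rng(0).standard_normal(t.size)

freqs = np.arange(1.0, 125.0, 1.0)           # trial frequencies in cps
power = np.zeros_like(freqs)

for i, f in enumerate(freqs):
    # correlation coefficients of the trace against a cosine and a sine
    # at this trial frequency; their squared sum is the power estimate
    c = np.dot(trace, np.cos(2.0 * np.pi * f * t))
    s = np.dot(trace, np.sin(2.0 * np.pi * f * t))
    power[i] = c * c + s * s

peak = freqs[np.argmax(power)]
print(f"spectrum peaks near {peak:.0f} cps")  # expect roughly 30 cps here
```

A damped 30 cps ring gives a spectrum that peaks near 30 cps, but nothing in those coefficients says whether a given frequency describes a real lobe or merely the dampening of the averaged shape, which is exactly the doubt raised above.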
I admit to being surprised at how well my logic works under tough data circumstances, and I continue to learn just watching it do its thing. In fact, I admit I believe in black boxes. When faced with good results from one, I want to know what I missed, and what that logic knows that I don't. It should be obvious that I am not up to arguing at his mathematical level, but there are some things I think I know that he might not. I have had others tell me that ADAPS could not honestly do what it does because of frequency limitations. When I point to the before-and-after well matches, their certainty is not shaken. However, those results fortify me to the point where I will argue with some basic tenets.

ADAPS depends on advanced pattern recognition in spike positioning. The shape of the estimated down wave is our major concern. When the frequency content changes with depth, the shape is obviously affected. An enormous amount of statistical effort is spent getting the best possible down wave guess in the target zone. Spikes on each trace are computed independently, so continuity of results becomes a logical proof.

The way I learned what made the transform work was by tearing apart the early code (I was around way back then). Of course this analysis is old. In my own simple-minded way I consider these computed frequencies to be “descriptive” (in the modeling sense). Sometimes they might point out offending events and sometimes they don't. High-frequency loss at depth is partly a function of the dampening of the leading edge of the down wave. As long as we can simulate the down wave shape accurately in the target zone, error in positioning should not be hypersensitive to frequency. This basic disagreement led to the spirited part of the discussion, and we parted with an agreement to disagree.

Before discussing filtering, let me return to my correspondent's argument quoted above. Much of the resolution power of ADAPS comes from its ability to integrate its spikes (a generic sketch of this integration step appears at the end of this note). To do this, the spikes must be unique to the reflecting interface. I respectfully submit that integrating the entire wavelet would not get the job done, and that the ability to optimize true spikes is essential. I believe ADAPS may stand alone in this capability.

Finally, my doubts about the true value of the transformed spectrum contribute to my dislike of frequency-sensitive filtering. In ADAPS we predict and gently lift off the noise.
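As a footnote to the integration argument, the sketch below is a generic illustration, not ADAPS itself (none of that code appears here): given spikes that really are reflection coefficients, each tied to a single interface, a simple running sum recovers a relative impedance profile layer by layer. The four-layer impedance model, interface times, and sample count are invented for the example.

```python
import numpy as np

# hypothetical four-layer acoustic impedance model (units arbitrary)
impedance = np.array([6.0e6, 7.5e6, 6.8e6, 6.3e6])

# reflection coefficient at each interface: r = (z2 - z1) / (z2 + z1)
rc = (impedance[1:] - impedance[:-1]) / (impedance[1:] + impedance[:-1])

# place the spikes at made-up interface times on a 200-sample trace
spikes = np.zeros(200)
spikes[[50, 110, 150]] = rc

# a running sum of the spikes approximates 0.5 * ln(z / z0); doubling and
# exponentiating it turns the spike train back into a layered profile,
# provided each spike is unique to a single reflecting interface
recovered = impedance[0] * np.exp(2.0 * np.cumsum(spikes))

print(np.round(recovered[[0, 80, 130, 199]] / 1.0e6, 1))
# -> approximately [6.0, 7.5, 6.8, 6.3], the impedance of each layer
```

If those same coefficients were instead smeared across overlapping wavelet lobes rather than sitting uniquely at their interfaces, the running sum would mix contributions from neighboring interfaces and the layered profile would not come back cleanly, which is the point of the argument above.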