Extraction of 3D Information by Use of Multispectral Imaging
José I. López, MS student (UPRM); Raúl E. Torres, PhD (UPRM); Miguel Vélez, PhD (UPRM)

GOALS
- Obtain 3D information, particularly that of shape and depth, by analyzing multispectral images.
- From a single camera and field of view, make an accurate estimate of the distance between the camera and a dense medium such as water, by using multispectral imaging.
- Extract depth information without the use of stereo vision or multi-view techniques. The main approach considered will be similar to range-finding methods, without the need to modify the multispectral imaging system.

Relevance to the CenSSIS Strategic Research Plan
Part of the goals of R2C is the discrimination of objects submerged in multi-layered complex media, such as coral reefs in shallow waters, using multispectral imaging. The extraction of 3D information could help us determine the depth of the object of interest and its position relative to its surroundings. We will apply this approach to the monitoring of coral reefs and other underwater habitats (S4).

INTRODUCTION
Range-finding tools provide very important information regarding an object's position in space. Currently available range-finding methods include stereovision, shape from shading, image defocusing, movable cameras, and laser triangulation systems.

Fig. 1. Optical triangulation geometry. The angle θ is the triangulation angle, while α is the tilt of the sensor plane needed to keep the laser plane in focus [3].
Fig. 2. The structure of a binocular depth perception system [1].
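For context, here is a minimal Python sketch of the classical binocular depth cue used by stereovision systems such as the one in Fig. 2: depth follows from the disparity between two views, which is exactly the second view this project aims to do without. The function name and all numbers are illustrative, not taken from the poster.

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Classical binocular (stereo) depth: Z = f * B / d.

    focal_length_px : camera focal length expressed in pixels
    baseline_m      : separation between the two camera centers, in meters
    disparity_px    : horizontal shift of the same scene point between the two views
    """
    return focal_length_px * baseline_m / disparity_px


# Illustrative numbers only (not measurements from this work):
print(depth_from_disparity(focal_length_px=800.0, baseline_m=0.12, disparity_px=16.0))
# 6.0 m -- larger disparities correspond to closer objects, the cue Fig. 2 exploits.
```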
BACKGROUND
The reflectance perceived from an object depends on its distance from the camera: the higher the object's apparent reflectance and illumination, the closer the object is to the camera. In the same way, as the object moves away from the camera, the angle of reflection increases and the color intensity decreases, which makes the object look darker. A possible way of obtaining 3D information from multispectral images is through differential light absorption.

Light returning from an underwater target is affected by the inherent absorption and scattering properties of the water, along with absorption and scattering by chlorophyll, organic material, and suspended sediment present in the medium.

Fig. 3. Representation of the effect of distance on intensity.
Fig. 4. Geometry for range from absorption [2].
Fig. 5. Side view of a range finder using differential light absorption [2].
Fig. 6. Dry spectrum of the target [4].
Fig. 7. Spectrum of the targets in a water medium [4].
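As a rough illustration of the range-from-absorption geometry in Figs. 4 and 5 (and of reference [2]), the sketch below estimates a path length through the medium from the intensity ratio of two spectral bands using the Beer–Lambert law. The function, the absorption coefficients, and the pixel intensities are hypothetical, not values reported in this work.

```python
import numpy as np


def depth_from_two_bands(i_band1, i_band2, alpha1, alpha2):
    """Estimate the path length d through an absorbing medium from the
    intensity ratio of two spectral bands via the Beer-Lambert law:

        I1 / I2 = exp(-(alpha1 - alpha2) * d)  =>  d = ln(I2 / I1) / (alpha1 - alpha2)

    Assumes the target reflectance and the illumination are approximately
    equal in both bands (e.g. a white-coated target, as in the first
    experiment), so the ratio depends only on the differential absorption.
    """
    i_band1 = np.asarray(i_band1, dtype=float)
    i_band2 = np.asarray(i_band2, dtype=float)
    return np.log(i_band2 / i_band1) / (alpha1 - alpha2)


# Illustrative values only: hypothetical absorption coefficients (1/m) for a
# strongly and a weakly absorbed band of water, and hypothetical pixel
# intensities of a white target viewed through it.
alpha_red, alpha_green = 0.45, 0.05
i_red, i_green = 120.0, 180.0
print(depth_from_two_bands(i_red, i_green, alpha_red, alpha_green))  # about 1.0 m
```

Note that if the camera and the light source both sit above the medium, the recovered length corresponds to the round-trip path through the medium rather than the depth alone.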
APPROACH
The first experiment consisted of measuring the spectral response of a particular object covered with a white reflectance coating so that it would have a uniform spectral signature. The result of this experiment was that pixel brightness is inversely related to the depth of the object and to the increase in the orientation of the surface relative to the optical axis. In each band we observed maximum-intensity pixels that correspond to the sensor.

The second experiment consisted of putting the same object in a liquid medium to observe the absorption coefficient of the liquid. In this case, the maximum-intensity pixel is not the same for all bands. When the object is observed through liquid, the perceived intensity becomes a function of depth in addition to other parameters such as the spectral characteristics and the light sources.

A third experiment was implemented using the concepts of the first two experiments, but without a pure white object. This experiment analyzed the bands that cross the object. We investigated a potential relationship between the background of the object and the band that was absorbed by the liquid medium after crossing the object. This could give us additional distance information to locate the object and hence estimate its size.

Fig. 8. Representation of the experiments: the first without the medium, the second with the medium, and the third without a pure white object.

FUTURE PLANS
- Development of an algorithm for the analysis of multispectral images that can extract valuable 3D data.
- Development of a mathematical equation system for the extraction of 3D data from multispectral images.
- Implementation of the algorithm for the analysis of underwater objects.
- Comparison of the efficiency of the proposed method with that of already available range-finding tools.

REFERENCES
[1] T. Yamakawa, K. Shimonomura, T. Udono, and T. Yagi, "Depth Perception Circuit Employing Serial Output Signals from Two Vision Chips," Systems, Man, and Cybernetics, IEEE SMC '99 Conference Proceedings, Vol. 4, 1999.
[2] D. Laurendeau, R. Houde, M. Samson, and D. Poussart, "3-D Range Acquisition Through Differential Light Absorption," IEEE Transactions on Instrumentation and Measurement, Vol. 41, No. 5, October 1992.
[3] B. Curless and M. Levoy, "Better Optical Triangulation through Spacetime Analysis," Conference on Computer Vision '95, June 1995.
[4] E. Rodríguez-Díaz, L. O. Jiménez-Rodríguez, M. Vélez-Reyes, F. Gilbes, and C. A. DiMarzio, "Subsurface Detection of Coral Reefs in Shallow Waters using Hyperspectral Data," in Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery IX, S. S. Shen and P. E. Lewis, Eds., Proceedings of SPIE, 2003.

CONTACT INFO
Raúl E. Torres, PhD
Miguel Vélez, PhD

This work was supported in part by CenSSIS, the Center for Subsurface Sensing and Imaging Systems, under the Engineering Research Centers Program of the National Science Foundation (Award Number EEC ).
