Tomography at Advanced Photon Source

Presentation transcript:

1 Tomography at Advanced Photon Source
A data deluge is happening in almost all science domains. Light source facilities in the US now generate several TB of data per day, and this volume is poised to increase by two orders of magnitude in the next few years. The figure shows the tomography system installed at APS. A sample (inserted by a robot for high-throughput operation) is rotated while synchrotron radiation passes through it, and images are captured by a CCD camera. The current camera-storage bus allows data to be transferred to disk at 100 frames per second (fps), corresponding to 820 MB/s or 67.6 TB/day. Scientific discovery at experimental facilities such as light sources is often limited by the available computing capabilities. EuroPar’15
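A quick back-of-the-envelope check of the quoted rates, assuming the 2K x 2K, 16-bit frame geometry described on the next slide (the small gap to the quoted 820 MB/s presumably reflects per-frame overhead not modeled here):

```python
# Acquisition-rate sanity check: 100 fps of 2048 x 2048 pixels at 16 bits/pixel.
FRAME_PIXELS = 2048 * 2048
BYTES_PER_PIXEL = 2          # 16-bit pixels
FPS = 100                    # camera-storage bus limit

bytes_per_second = FRAME_PIXELS * BYTES_PER_PIXEL * FPS
mib_per_second = bytes_per_second / 2**20
tib_per_day = bytes_per_second * 86400 / 2**40

print(f"{mib_per_second:.0f} MiB/s")   # ~800 MiB/s, quoted as 820 MB/s
print(f"{tib_per_day:.1f} TiB/day")    # ~66 TiB/day, quoted as 67.6 TB/day
```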

2 Tomographic Image Reconstruction
Analysis of tomographic datasets is challenging:
- Long image reconstruction/analysis times: e.g., a 12 GB dataset takes 12 hours on 24 cores
- Different reconstruction algorithms; iterative ones mean longer computation times
- Input dataset smaller than output dataset: 73 MB vs. 476 MB
- High data acquisition rates: e.g., 12 GB in < 1 s (2K x 2K pixels, 16 bits/pixel, 1500 projections)
- Advances in light sources and detectors are expected to increase rates by several orders of magnitude

Beam time at light source facilities is a precious commodity, booked months in advance and typically for only a relatively short period (a day or perhaps a week). The data must be analyzed rapidly so that results from one experiment can guide the selection of the next, or even influence the course of a single experiment. Otherwise, users collect as much as they can in the given time; sometimes the data are not useful, and sometimes they collect more data than they need. Iterative reconstruction algorithms are computationally intensive, and the output of the reconstruction is larger than the input data.
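The "12 GB in under a second" figure follows directly from the stated scan parameters:

```python
# Size of one tomography scan as described on the slide:
# 1500 projections, each 2048 x 2048 pixels at 16 bits/pixel.
projections = 1500
frame_bytes = 2048 * 2048 * 2            # 16-bit pixels

scan_bytes = projections * frame_bytes
print(f"{scan_bytes / 1e9:.1f} GB")      # ~12.6 GB, quoted as "12 GB in < 1 s"
```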

3 Processing Structure

Inputs:
  IS: assigned projection slices
  Recon: reconstruction object
  dist: subsetting distance
Output:
  Recon: final reconstruction object

  /* (Partial) iteration i */
  For each assigned projection slice, is, in IS {
    IR = GetOrderedRaySubset(is, i, dist);
    For each ray, ir, in IR {
      (k, off, val) = LocalRecon(ir, Recon(is));
      ReconRep(k) = Reduce(ReconRep(k), off, val);
    }
  }
  /* Combine updated replicas */
  Recon = PartialCombination(ReconRep)
  /* Exchange and update adjacent slices */
  Recon = GlobalCombination(Recon)

[Figure: processing structure of iteration i — a local reconstruction phase (threads 0..m on nodes 0..n apply LocalRecon and update ReconRep replicas), a partial combination phase, and a global combination phase; inputs are the projections and Recon[i-1], output is Recon[i].]

Tomography datasets are stored using a scientific data format such as HDF5 [6]. Before beginning reconstruction, our middleware reads metadata from the input dataset and allocates the resources required for the output dataset. The middleware eliminates race conditions by creating a replica of the output object for each thread, which we refer to as ReconRep in Fig. 3(a). Local reconstruction corresponds to the mapping phase of the MapReduce processing structure. The user implements the reconstruction algorithm in the local reconstruction function (LocalRecon in Fig. 3(a)); the middleware applies this function to each assigned data chunk. Each data chunk can be a set of slices (for per-slice parallelization) or a subset of rays in a slice (for in-slice parallelization). After all the rays in the assigned data chunks are processed, the threads working on the same slices synchronize and combine their replicas by using a user-defined function.
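A minimal sketch of the replica-based reduction pattern described above, using Python threads and a toy "reconstruction" that merely accumulates per-slice ray contributions. The names (local_recon, the replica merge) mirror the slide's pseudocode, but the numerical content is a placeholder, not the actual reconstruction algorithm:

```python
import threading

N_THREADS = 4
N_SLICES = 8
recon = [0.0] * N_SLICES                      # shared reconstruction object

def local_recon(ray):
    """Placeholder: map a ray to (slice index, contribution)."""
    return ray % N_SLICES, float(ray)

def worker(rays, replica):
    # Local reconstruction phase: each thread updates only its own
    # replica, so no locking is needed and races are avoided.
    for ray in rays:
        k, val = local_recon(ray)
        replica[k] += val

rays = list(range(32))
replicas = [[0.0] * N_SLICES for _ in range(N_THREADS)]
threads = [threading.Thread(target=worker,
                            args=(rays[t::N_THREADS], replicas[t]))
           for t in range(N_THREADS)]
for t in threads: t.start()
for t in threads: t.join()

# Partial combination phase: merge per-thread replicas into the shared object.
for replica in replicas:
    for k in range(N_SLICES):
        recon[k] += replica[k]

print(recon)  # each slice holds the sum of its rays' contributions
```

In a multi-node run, a global combination phase would follow, exchanging boundary slices between nodes (e.g., via MPI); that step is omitted here.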

