Multi-modal Imaging for Portal-based Screening: Multispectral Methods for Diffraction Tomography
F3-B (Phase 1)

Download the 2013 Project Report

This project investigates the development of automated explosives detection and classification algorithms for increased throughput by using combinations of sensors in an active, adaptive testing scheme. Multi-modal sensors can help find and distinguish the features of existing threats, and even discover and classify new ones. The significance of this project lies in the potential to fuse multiple modalities to detect the presence of explosives, in both portal and stand-off systems, and then to classify their nature as specifically and sensitively as possible. In our recent work, we have developed new theories for increasing the signal-to-noise ratio in diffraction tomography using sensors that collect measurements at multiple frequencies, adapting techniques previously exploited for multi-modal imaging in medical applications. For X-ray systems, this includes investigating the use of multi-energy X-ray techniques and X-ray diffraction tomography. This past year, we have continued our work on fusing X-ray diffraction tomography with conventional X-ray computed tomography (CT), extracting information from the coherent scatter of materials and combining it with the conventional CT absorption images. The focus has been on exploring architectures that provide stronger signals, including coded aperture imaging, and that consider the full spectral characteristics of X-ray sources and detectors.
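The coded-aperture, multi-spectral measurement scheme mentioned above can be illustrated with a toy linear model. The Python sketch below is not the project's actual code: the array sizes, the random binary mask, the Gaussian noise standing in for photon statistics, and the ridge regularization weight are all assumptions made purely for illustration. It shows how a known coded aperture mixes coherent-scatter signals from several energy bins onto detector elements, and how a regularized least-squares step can recover the per-energy scatter profile.

import numpy as np

rng = np.random.default_rng(0)

n_pixels = 64        # spatial samples of the scatter profile (assumed)
n_energies = 8       # spectral bins of the source/detector (assumed)
n_detectors = 256    # detector elements behind the coded aperture (assumed)

# Random binary coded-aperture / spectral-response matrix: each row gives the
# weight one detector element assigns to each (pixel, energy) pair.
A = rng.integers(0, 2, size=(n_detectors, n_pixels * n_energies)).astype(float)

# Ground-truth scatter profile: a few "materials" with distinct spectral signatures.
x_true = np.zeros((n_pixels, n_energies))
x_true[10:20, 2] = 1.0
x_true[40:50, 5] = 0.7
x_true = x_true.ravel()

# Noisy measurements (Gaussian noise is a stand-in for photon statistics here).
y = A @ x_true + 0.05 * rng.standard_normal(n_detectors)

# Ridge-regularized least-squares reconstruction; the weight 0.1 is arbitrary.
lam = 0.1
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)

err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")

In practice the reconstruction would also incorporate the conventional CT absorption image and a realistic spectral forward model, but the same linear-inverse-problem structure applies.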

My involvement with ALERT has modified my research agenda to integrate far greater knowledge of security threats and sensing modalities that are capable of extracting relevant information for detection and classification of such threats.
- Project Leader, David Castañón
Project Leader
  • David Castañón
    Professor
    Boston University

Faculty and Staff Currently Involved in Project
  • W. Clem Karl
    Professor
    Boston University

Students Currently Involved in Project
  • Ke Chen
    Boston University