Data level fusion using common symmetry set


Tari S.

Conference on Sensor Fusion: Architectures, Algorithms, and Applications III, Florida, United States of America, 7-9 April 1999, vol. 3719, pp. 327-331

  • Publication Type: Conference Paper / Full Text
  • Volume: 3719
  • Doi Number: 10.1117/12.341354
  • City: Florida
  • Country: United States of America
  • Page Numbers: 327-331

Abstract

The availability of different imaging modalities requires techniques to process and combine information from different images of the same phenomenon. We present a symmetry-based approach for combining information from multiple images. Fusion is performed at the data level: actual object boundaries and shape descriptors are recovered directly from the raw sensor output(s). The method is applicable to an arbitrary number of images in arbitrary dimensions.
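
To illustrate the general idea of data-level fusion followed by symmetry-set extraction, the following is a minimal, hypothetical sketch, not the paper's algorithm: it assumes co-registered 2D images, uses simple averaging as the data-level fusion step, and takes distance-transform ridges of the fused object mask as a crude medial-axis stand-in for the common symmetry set. All function names and parameters are illustrative.

```python
# Hypothetical sketch only: fuse co-registered images at the data level,
# then extract skeleton-like ridges of a distance field as a rough
# symmetry-set proxy. Not the construction used in the paper.
import numpy as np
from scipy.ndimage import distance_transform_edt, gaussian_filter, maximum_filter

def fuse_images(images, sigma=1.5):
    """Data-level fusion: average raw intensities across modalities, then smooth."""
    stack = np.stack([img.astype(float) for img in images])
    return gaussian_filter(stack.mean(axis=0), sigma)

def symmetry_ridges(fused, threshold=None):
    """Ridge points of the distance transform of the fused object mask."""
    if threshold is None:
        threshold = fused.mean()              # crude global threshold (illustrative)
    mask = fused > threshold
    dist = distance_transform_edt(mask)       # distance to the recovered boundary
    local_max = maximum_filter(dist, size=3)  # 3x3 local maxima of the distance field
    return (dist > 0) & (dist == local_max)

if __name__ == "__main__":
    # Two noisy synthetic "modalities" of the same elliptical object.
    yy, xx = np.mgrid[0:128, 0:128]
    obj = (((xx - 64) / 40.0) ** 2 + ((yy - 64) / 20.0) ** 2 < 1.0).astype(float)
    rng = np.random.default_rng(0)
    imgs = [obj + 0.3 * rng.standard_normal(obj.shape) for _ in range(2)]

    fused = fuse_images(imgs)
    ridges = symmetry_ridges(fused)
    print("ridge points:", int(ridges.sum()))
```

Because the fusion happens on the raw intensity data before any per-image segmentation, the same pipeline extends directly to more than two inputs and, with minor changes to the synthetic example, to higher-dimensional image arrays.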