Publications

Deep Learning Methods Toward Generalized Change Detection on Planetary Surfaces

In review, 2019

Ongoing Mars exploration missions are returning large volumes of image data. Identifying surface changes in these images, e.g., new impact craters, is critical for investigating many scientific hypotheses. Traditional approaches to change detection rely on image differencing and manual feature engineering. These methods can be sensitive to irrelevant variations in illumination or image quality and typically require the before and after images to be co-registered, which is itself a major challenge. To overcome these limitations, we propose novel deep learning approaches to change detection that rely on transfer learning, convolutional autoencoders, and Siamese networks. Our experiments show that these approaches can detect meaningful changes with high accuracy despite significant differences in illumination, image quality, imaging sensors, and alignment between the before and after images. We show that the latent representations learned by a convolutional autoencoder are the most general for detecting change across surface feature types, scales, sensors, and planetary bodies.
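
To give a flavor of the autoencoder-based comparison described above, here is a minimal PyTorch sketch of scoring change between co-located before/after patches in an autoencoder's latent space. It is not the paper's architecture: the layer sizes, 64x64 single-band patches, and the cosine-distance change score are illustrative assumptions.

```python
# Illustrative sketch only: change scoring via a convolutional autoencoder's
# latent space. Architecture and score are assumptions, not the paper's model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvAutoencoder(nn.Module):
    def __init__(self, latent_dim=64):
        super().__init__()
        # Encoder: 1-channel 64x64 patch -> latent vector
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, latent_dim),
        )
        # Decoder: latent vector -> reconstructed patch
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 16 * 16),
            nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 2, stride=2),     # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 2, stride=2),      # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

def change_score(model, before, after):
    """Cosine distance between latent codes of co-located patches."""
    with torch.no_grad():
        _, z_before = model(before)
        _, z_after = model(after)
    return 1.0 - F.cosine_similarity(z_before, z_after, dim=1)

# Usage: train the autoencoder with a reconstruction loss, then score pairs.
model = ConvAutoencoder()
before = torch.rand(8, 1, 64, 64)  # batch of "before" patches
after = torch.rand(8, 1, 64, 64)   # batch of "after" patches
print(change_score(model, before, after))  # higher -> more likely changed
```

Comparing patches in the learned latent space, rather than differencing raw pixels, is what lets this style of approach tolerate illumination and registration differences between the two images.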

Recommended citation: Kerner, H. R., Wagstaff, K. L., Bue, B., Gray, P. C., Bell III, J. F., Ben Amor, H. (2019). "Deep Learning Methods Toward Generalized Change Detection on Planetary Surfaces." In Review.

Novelty Detection for Multispectral Images with Application to Planetary Exploration

Innovative Applications of Artificial Intelligence (IAAI/AAAI), 2019

In this work, we present a system based on convolutional autoencoders for detecting novel features in multispectral images. We introduce SAMMIE: Selections based on Autoencoder Modeling of Multispectral Image Expectations. Previous work using autoencoders employed the scalar reconstruction error to classify new images as novel or typical. We show that a spatial-spectral error map can enable both accurate classification of novelty in multispectral images and human-comprehensible explanations of the detection. We apply our methodology to the detection of novel geologic features in multispectral images of the Martian surface collected by the Mastcam imaging system on the Mars Science Laboratory Curiosity rover.
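
The sketch below illustrates the core idea of a spatial-spectral error map versus a single scalar error, assuming a trained autoencoder is available. The band count, aggregation choices, and the `explain` helper are hypothetical, included only to show how the map supports both ranking and explanation.

```python
# Illustrative sketch: per-pixel, per-band reconstruction error instead of a
# single scalar. Band count and aggregation are assumptions for illustration.
import numpy as np

def error_map(image, reconstruction):
    """Squared reconstruction error with shape (height, width, bands).

    The map shows where (spatially) and in which bands (spectrally) the
    autoencoder failed to reproduce the input.
    """
    return (image - reconstruction) ** 2

def novelty_score(err):
    """Scalar summary used to rank images; the map itself is the explanation."""
    return float(err.mean())

def explain(err, top_k=3):
    """List the bands contributing most to the error (spectral explanation)."""
    per_band = err.mean(axis=(0, 1))
    return np.argsort(per_band)[::-1][:top_k]

# Usage with a hypothetical trained autoencoder `model`:
#   recon = model.predict(image[np.newaxis])[0]
image = np.random.rand(64, 64, 6)                         # stand-in 6-band image
recon = np.clip(image + 0.05 * np.random.randn(64, 64, 6), 0, 1)
err = error_map(image, recon)
print(novelty_score(err), explain(err))
```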

Recommended citation: Kerner, H. R., Wellington, D. F., Wagstaff, K. L., Bell III, J. F., Ben Amor, H. (2019). "Novelty Detection for Multispectral Images with Application to Planetary Exploration." In Proc. of Innovative Applications of Artificial Intelligence (IAAI/AAAI) 2019.

Context-Dependent Image Quality Assessment of JPEG-Compressed Mars Science Laboratory Mastcam Images using Convolutional Neural Networks

Computers and Geosciences, 2018

The Mastcam color imaging system on the Mars Science Laboratory Curiosity rover acquires images that are often JPEG-compressed before being downlinked to Earth. Depending on the context of the observation, this compression can result in image artifacts that might introduce problems in the scientific interpretation of the data and might require the image to be retransmitted losslessly. We propose to streamline the tedious process of manually reviewing these images by using context-dependent image quality assessment, in which the context and intent behind the image observation determine the acceptable image quality threshold. We propose a neural network solution for estimating the probability that a Mastcam user would find the quality of a compressed image acceptable for science analysis. We also propose an automatic labeling method that avoids the need for domain experts to label thousands of training examples. We performed multiple experiments to evaluate the ability of our model to assess context-dependent image quality, the efficiency a user might gain when incorporating our model, and the uncertainty of the model given different types of input images. We compare our approach to the state of the art in no-reference image quality assessment. Our model correlates well with the perceptions of scientists assessing context-dependent image quality and could result in significant time savings when included in the current Mastcam image review process.
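
As a rough illustration of the classification setup, the PyTorch sketch below shows a small CNN that outputs the probability a compressed patch is acceptable for science analysis. The layer sizes, 64x64 grayscale patches, and the 0.5 decision threshold are assumptions, not the paper's actual network or operating point.

```python
# Illustrative sketch: CNN estimating the probability that a compressed image
# patch is of acceptable quality. Sizes and threshold are assumptions.
import torch
import torch.nn as nn

class QualityClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),  # logit for "acceptable quality"
        )

    def forward(self, x):
        return torch.sigmoid(self.classifier(self.features(x)))

# Usage: train with nn.BCELoss against labels derived automatically (e.g., by
# comparing compressed patches with lossless counterparts), then flag images
# below an acceptability threshold for possible lossless retransmission.
model = QualityClassifier()
patch = torch.rand(4, 1, 64, 64)        # batch of compressed patches
p_acceptable = model(patch).squeeze(1)
needs_retransmit = p_acceptable < 0.5   # assumed decision threshold
print(p_acceptable, needs_retransmit)
```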

Recommended citation: Kerner, H. R., Bell III, J. F., Ben Amor, H. (2018). "Context-Dependent Image Quality Assessment of JPEG-Compressed Mars Science Laboratory Mastcam Images using Convolutional Neural Networks." Computers and Geosciences, 118, pp. 109-121.