Posts by Collection

portfolio

publications

Context-Dependent Image Quality Assessment of JPEG-Compressed Mars Science Laboratory Mastcam Images using Convolutional Neural Networks

Computers & Geosciences, 2018

The Mastcam color imaging system on the Mars Science Laboratory Curiosity rover acquires images that are often JPEG compressed before being downlinked to Earth. Depending on the context of the observation, this compression can result in image artifacts that might introduce problems in the scientific interpretation of the data and might require the image to be retransmitted losslessly. We propose to streamline the tedious process of manually analyzing images using context-dependent image quality assessment, a process wherein the context and intent behind the image observation determine the acceptable image quality threshold. We propose a neural network solution for estimating the probability that a Mastcam user would find the quality of a compressed image acceptable for science analysis. We also propose an automatic labeling method that avoids the need for domain experts to label thousands of training examples. We performed multiple experiments to evaluate the ability of our model to assess context-dependent image quality, the efficiency a user might gain when incorporating our model, and the uncertainty of the model given different types of input images. We compare our approach to the state of the art in no-reference image quality assessment. Our model correlates well with the perceptions of scientists assessing context-dependent image quality and could result in significant time savings when included in the current Mastcam image review process.
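To give a concrete sense of the kind of model described here, the sketch below shows a deliberately small binary-classification CNN that maps a compressed image patch to the probability that its quality is acceptable for science analysis. The patch size, layer widths, and architecture are illustrative placeholders, not the network described in the paper.

```python
# Minimal sketch (not the paper's architecture): a small CNN that maps a
# JPEG-compressed image patch to P(quality is acceptable for science analysis).
# The 64x64 patch size and layer sizes are assumptions for illustration.
import torch
import torch.nn as nn

class QualityCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        # Returns the estimated probability of acceptability for each patch.
        return torch.sigmoid(self.classifier(self.features(x)))

model = QualityCNN()
patch = torch.rand(1, 3, 64, 64)   # stand-in for a compressed Mastcam patch
print(model(patch).item())         # probability the quality is acceptable
```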

Recommended citation: Kerner, H. R., Bell III, J. F., Ben Amor, H. (2018). "Context-Dependent Image Quality Assessment of JPEG-Compressed Mars Science Laboratory Mastcam Images using Convolutional Neural Networks." Computers & Geosciences, 118, pp. 109-121.

Novelty Detection for Multispectral Images with Application to Planetary Exploration

Innovative Applications of Artificial Intelligence (IAAI/AAAI), 2019

In this work, we present a system based on convolutional autoencoders for detecting novel features in multispectral images. We introduce SAMMIE: Selections based on Autoencoder Modeling of Multispectral Image Expectations. Previous work using autoencoders employed the scalar reconstruction error to classify new images as novel or typical. We show that a spatial-spectral error map can enable both accurate classification of novelty in multispectral images and human-comprehensible explanations of the detection. We apply our methodology to the detection of novel geologic features in multispectral images of the Martian surface collected by the Mastcam imaging system on the Mars Science Laboratory Curiosity rover.
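As a rough illustration of the distinction between a scalar reconstruction error and a spatial-spectral error map, the sketch below compares the two for an arbitrary multispectral image. The array shapes and the stand-in reconstruction are assumptions for illustration, not the SAMMIE implementation.

```python
# Minimal sketch (assumed shapes, not the SAMMIE code): compare the scalar
# reconstruction error used in prior work to a spatial-spectral error map,
# given a multispectral image and its autoencoder reconstruction.
import numpy as np

def error_maps(image, reconstruction):
    """image, reconstruction: arrays of shape (bands, height, width)."""
    # Spatial-spectral error map: one error value per band and per pixel,
    # which localizes where (and in which band) the novelty appears.
    spatial_spectral = (image - reconstruction) ** 2
    # Scalar error: a single number per image, as in prior autoencoder-based
    # novelty detection; it flags that an image is novel but not why.
    scalar = spatial_spectral.mean()
    return spatial_spectral, scalar

# Toy example with a 6-band, 64x64 multispectral image.
rng = np.random.default_rng(0)
img = rng.random((6, 64, 64))
recon = img + 0.01 * rng.standard_normal(img.shape)  # stand-in for autoencoder output
err_map, err_scalar = error_maps(img, recon)
band, row, col = np.unravel_index(err_map.argmax(), err_map.shape)
print(f"scalar error={err_scalar:.4f}; largest error in band {band} at pixel ({row}, {col})")
```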

Recommended citation: Kerner, H. R., Wellington, D. F., Wagstaff, K. L., Bell III, J. F., Ben Amor, H. (2019). "Novelty Detection for Multispectral Images with Application to Planetary Exploration." In Proc. of Innovative Applications of Artificial Intelligence (IAAI/AAAI) 2019.

Deep Learning Methods Toward Generalized Change Detection on Planetary Surfaces

In review, 2019

Ongoing Mars exploration missions are returning large volumes of image data. Identifying surface changes in these images, e.g., new impact craters, is critical for investigating many scientific hypotheses. Traditional approaches to change detection rely on image differencing and manual feature engineering. These methods can be sensitive to irrelevant variations in illumination or image quality and typically require before and after images to be co-registered, which itself is a major challenge. To overcome these limitations, we propose novel deep learning approaches to change detection that rely on transfer learning, convolutional autoencoders, and Siamese networks. Our experiments show that these approaches can detect meaningful changes with high accuracy despite significant differences in illumination, image quality, imaging sensors, and alignment between before and after images. We show that the latent representations learned by a convolutional autoencoder yield the most general representations for detecting change across surface feature types, scales, sensors, and planetary bodies.
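The sketch below illustrates the Siamese-style idea of scoring change by comparing latent representations of a before and an after patch produced by a shared encoder. The encoder architecture, latent size, and cosine-distance score are illustrative assumptions rather than the models evaluated in the paper.

```python
# Minimal sketch (illustrative assumptions): score change between "before" and
# "after" patches by comparing latent vectors from a shared convolutional
# encoder, Siamese-style. The encoder here is untrained and its architecture
# is a placeholder.
import torch
import torch.nn as nn
import torch.nn.functional as F

encoder = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
    nn.Conv2d(8, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
    nn.Flatten(),
    nn.Linear(16 * 16 * 16, 32),    # 32-dimensional latent representation
)

def change_score(before, after):
    # The same (shared-weight) encoder is applied to both patches; a large
    # distance between latent vectors suggests a surface change.
    with torch.no_grad():
        z_before = encoder(before)
        z_after = encoder(after)
    return 1.0 - F.cosine_similarity(z_before, z_after).item()

before = torch.rand(1, 1, 64, 64)   # stand-in for a co-located "before" patch
after = torch.rand(1, 1, 64, 64)    # stand-in for the "after" patch
print(f"change score: {change_score(before, after):.3f}")
```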

Recommended citation: Kerner, H. R., Wagstaff, K. L., Bue, B., Gray, P. C., Bell III, J. F., Ben Amor, H. (2019). "Deep Learning Methods Toward Generalized Change Detection on Planetary Surfaces." In Review.

talks

teaching

Science Olympiad: Reach for the Stars

Outreach, North Carolina Science Olympiad, Phillips Middle School, 2012

I coached a middle school Science Olympiad team at Phillips Middle School for two years in the “Reach for the Stars” event, which covered fundamentals of stars and galaxies. You can find resources on this event here.

COMP 110: Introduction to Programming (using Java)

Undergraduate course, UNC Chapel Hill, Department of Computer Science, 2014

I was a teaching assistant for this introductory computer programming course for students with no prior experience in computer science. We used the Java programming language. You can find the course description and syllabus here.

Introduction to Scientific Programming

Undergraduate course, UNC Chapel Hill, Department of Computer Science, 2014

I was a teaching assistant for this introductory scientific programming course for students with no prior experience in programming. We used MATLAB. You can find the course description and syllabus here.

COMP 190: Computer Science for People Who Do Not Know CS (Yet)

Undergraduate course, UNC Chapel Hill, Department of Computer Science, 2015

I designed and taught this course to introduce the beauty and joy of computer science to students who might not have considered majoring in CS due to a lack of experience or confidence. This course was inspired by the diversity and inclusion efforts of UC Berkeley and Harvey Mudd College. Dr. Kevin Jeffay mentored me throughout this process. I taught introductory concepts from several areas of computer science, including artificial intelligence and object-oriented programming. The course culminated in a final project in which students worked in small groups to build a tool of their choosing using at least one external Python library. You can view the syllabus here.

Girls Who Code

Outreach, Girls Who Code, Maie B. Heard Elementary School, 2016

Girls Who Code is an organization that facilitates free summer and after-school programs to teach girls to code and introduce them to career paths in computer science. I have taught 6th-8th grade girls at Maie B. Heard Elementary in Phoenix, AZ since 2016. In addition to programming exercises, the students work on an impact project that they choose and design each year. Here is one of their projects, in which they sought to improve the quality and nutrition of the lunches at their school.

ASU Prison Education Program

Outreach, Arizona State University, Prison Education Program, 2018

The Prison Education Program at ASU is a program in which ASU students and staff teach courses to incarcerated adults and juveniles. I help teach Algebra 1A and GED math at the Adobe Mountain School, a juvenile detention facility in Deer Valley, AZ. Additionally, I am helping to design and teach the planetary exploration unit in our new Earth and Space Exploration course at Eyman Prison in Florence, AZ.