This paper introduces a dataset of 7,330 HAADF-STEM images paired with acquisition metadata and uses it to learn a joint embedding space. By linking image style to instrument parameters, the authors train a generative style transfer network that modifies the style of experimental images conditioned on metadata. The trained network demonstrates utility in physical denoising of materials TEM images.
Unlock the secrets hidden in your lab's backed-up microscopy data: style transfer networks can now "re-imagine" images as if they were captured with different instrument settings.
The vast majority of transmission electron microscopy (TEM) data never gets published and ends up on a backup drive until it is deleted to free up space. These leftover datasets are rich in detail and variation, and they are often paired with automatically saved metadata describing the instrument state and acquisition parameters. In this work, we introduce a dataset of 7,330 high-angle annular dark-field scanning-TEM (HAADF-STEM) images from a single instrument and use it to learn a joint embedding space between image metadata and HAADF images. These embeddings link image style with acquisition parameters, which allows us to train a generative style transfer network that converts experimental images into the style they would have had if they had been recorded with different instrument parameters. We evaluate the performance of the network and explore the usefulness of the technique for physical denoising.
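The abstract describes learning a joint embedding space that aligns each image with its acquisition metadata. The paper does not specify the training objective, but one common way to learn such an alignment is a CLIP-style contrastive (InfoNCE) loss, where matching image/metadata pairs are pulled together and mismatched pairs pushed apart. The sketch below, using NumPy on random vectors, is purely illustrative; the function name, temperature value, and embedding dimensions are assumptions, not details from the paper.

```python
import numpy as np

def info_nce_loss(img_emb, meta_emb, temperature=0.07):
    """Contrastive loss aligning image embeddings with metadata embeddings.

    Row i of img_emb and row i of meta_emb are assumed to come from the
    same acquisition (a positive pair); all other rows are negatives.
    """
    # L2-normalize both embedding sets so similarities are cosines
    img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
    meta = meta_emb / np.linalg.norm(meta_emb, axis=1, keepdims=True)
    # Pairwise similarity matrix, scaled by the temperature
    logits = img @ meta.T / temperature
    # Cross-entropy over rows: each image should select its own metadata
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.arange(len(img))
    return float(-log_probs[idx, idx].mean())

# Toy usage: 8 hypothetical image/metadata pairs in a 32-d embedding space
rng = np.random.default_rng(0)
img = rng.normal(size=(8, 32))
meta = rng.normal(size=(8, 32))
print(info_nce_loss(img, meta))
```

In the paper's setting, `img_emb` would come from an image encoder applied to HAADF-STEM frames and `meta_emb` from an encoder over the saved acquisition parameters; minimizing this loss is what would tie image style to instrument settings.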