
Analysis of point defects in graphene using low-dose scanning transmission electron microscopy imaging and maximum likelihood reconstruction


Christian Kramberger¹, Andreas Mittelberger¹, Christoph Hofer¹, Jannik C. Meyer¹

¹ Faculty of Physics, University of Vienna, Boltzmanngasse 5, 1090 Vienna, Austria


Abstract: Freestanding graphene displays an outstanding resilience to electron irradiation at low electron energies. Point defects in graphene are, however, subject to beam-driven dynamics. This means that high-resolution micrographs of point defects, which usually require a high electron irradiation dose, might not represent the intrinsic defect population. Here, we capture the initial defects formed by ejecting carbon atoms under electron irradiation by imaging with very low doses and subsequently reconstructing the frequently occurring defects via a maximum likelihood algorithm.


date: July 2, 2019

1 Introduction

With the advent of aberration correction [1, 2], electron microscopy has entered the realm of atomic resolution for light elements [3, 4, 5]. While this would in principle enable studies of individual organic molecules, beam damage poses a serious challenge in this field [6, 7]: the structures of interest are destroyed or altered by the probing electron beam before they can be imaged with satisfactory statistics. A prime example of dynamic entities under electron irradiation are vacancy defects in graphene. Defects in graphene are typically more beam-stable than organic molecules; nevertheless, they change configuration under the typical doses needed for a high-resolution image [8, 9]. To date, the statistics on divacancy reconstructions have shown a non-thermal, inverted population for the three basic divacancies with 0, 3 and 4 pairs of a 5- and a 7-membered ring [10, 11]. Here we demonstrate the application of maximum likelihood reconstruction [12, 13] to very low-dose micrographs containing sparse, dynamically created point defects that are not directly visible. The resulting population of point defects is consistent with single- and double-atom ejection events dominating over sequential Stone-Wales transformations. Only one type of divacancy is observed, while the other divacancies appear to be absent.

2 Overview

This paper demonstrates the recovery of point-defect images in graphene from scanning transmission electron microscopy (STEM) data recorded at 1-2 orders of magnitude lower doses than typically used for direct imaging of the atomic structure. Similar to cryo-electron microscopy studies of biological molecules [14, 15], we aim to distribute the dose over many identical copies of an object and to retrieve the equivalent of a high-dose image via a suitable reconstruction algorithm. However, in contrast to large organic molecules, the locations of individual point defects (a few missing atoms) cannot be discerned in the individual low-dose exposures (e.g. Figs. 1&2). Hence, any algorithm that requires locating and classifying individual objects prior to averaging (as is common in the single-particle analysis of biological molecules) will not work for this purpose.

To solve this problem, we have developed a new algorithm that can recover repeatedly occurring deviations from the periodic lattice even if they cannot be located in the noisy images [12, 13]. The algorithm employs a maximum-likelihood approach in which a set of model images is iteratively optimized so that the models predict the experimental data as well as possible. Apart from the underlying periodic lattice and an approximate expected lateral extension of the defect structures, no a priori assumptions about the defect structures are required. Ref. [12] contained a general description of the ideas in the context of other literature, and a first proof of concept based on simulations. In Ref. [13], we explored the limitations of the new approach and optimized the implementation to be applicable to real experimental data (demonstrated with high-dose images that were artificially resampled to mimic low-dose conditions). In order to obtain true low-dose images, pristine areas of the sample have to be imaged without pre-exposing the respective sample area for focusing. This is achieved by an automated low-dose acquisition scheme, in which focus references are taken at the corners of the region of interest and the focus is then interpolated (reported elsewhere [16]).

The present paper shows a reconstruction from true low-dose data, expanding on those aspects that are important with real experimental data (and have not been covered in previous simulation-based works), along with a discussion of the findings. In short, these aspects are as follows: After data acquisition, the underlying periodic lattice of all micrographs has to be un-distorted and brought into registry. Then the probability distribution function (PDF) can be collected for different expectation values. With this information, the likelihood that the entire experimental dataset is an actual observation of some composition of a finite set of small repeating patches can be calculated. The expression for the likelihood is given in Eqs. 1&2, and the maximum likelihood algorithm optimizes the likelihood value by iteratively adjusting a set of model images. In the end, effective high-dose images of the most frequently occurring defects are obtained, as shown in Figure 5, along with relative weights representing their density.

3 Experimental

All raw STEM images were taken with a Nion UltraSTEM 100, operated at 100 kV, with the medium-angle annular dark-field (MAADF) detector spanning a range of ca. 60-200 mrad. We used a commercial Graphenea® specimen of graphene on a Quantifoil® TEM grid. The 12 nm field of view was scanned with 2048x2048 pixels at a dwell time of only  s. We employed a user-written extension of the microscope control software (Nion Swift) to facilitate the task of automatically mapping large sample areas [16], without prior exposure of the pristine sample for focusing or stigmation. For the experiment reported here, the microscope was intentionally operated at 100 kV, so that knock-on damage leads to the formation of vacancies [17]. We also intentionally obtained multiple micrographs of each sample location, so as to capture beam-induced defects in subsequent exposures. The acquisitions were repeated until either the clean graphene coverage dropped below 30%, which typically happened after 30 frames, or to a maximum of 80 frames. A total of 1187 frames with a 12 nm field of view were recorded, each with a dose of  e/Å². The post-processed signal-to-noise ratio of −6.9 dB is measured in accordance with Ref. [13] as the contrast of the graphene lattice compared to the noise level. Examples are shown in Fig. 1. Per frame, this dose is more than 300 times lower than the dose reported in Ref. [5] for MAADF imaging of single-layer light-element samples, and even the cumulative dose of 30-80 exposures is less than the typical dose for an individual, high signal-to-noise ratio MAADF image of graphene. In this context, we also point out that each acquisition series starts from zero exposure, after focusing elsewhere, so that we record a signal starting with the first electron on a fresh spot of the sample. This is in contrast to conventional, manual imaging of materials, where a region of interest is typically exposed to high doses before any data is captured.

Figure 1: Top grid: low-dose snapshots of consecutive, spatially separated sites. Bottom grid: the first and then every tenth low-dose acquisition at the same site. A dynamic hole forms at the central beam parking position. The lattice is barely visible, and point defects are not directly visible. Contaminated areas (red) and holes (cyan) are excluded from the data analysis.

4 Data processing and analysis

As the first step of the data processing, areas containing bulk contamination or holes in the graphene are masked out by a thresholded median filter followed by a dilation step (an example of masked-out areas is given in the lower grid of Fig. 1). The micrographs were also median filtered with a 3x3 kernel to eliminate outliers.
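
A minimal sketch of this masking and outlier-removal step, using standard scipy.ndimage operations, could look as follows. The relative thresholds, the coarse filter size and the dilation radius are illustrative assumptions, not the values used for the actual data:

```python
import numpy as np
from scipy import ndimage

def mask_and_denoise(image, low=0.2, high=3.0, dilate_px=8):
    """Mask contamination/holes and remove outliers (illustrative sketch).

    `low`/`high` are thresholds relative to the median intensity and
    `dilate_px` is the dilation radius; all three are assumed values.
    """
    # Outlier removal: 3x3 median filter, as described in the text.
    filtered = ndimage.median_filter(image, size=3)

    # A coarse median filter highlights extended bright (contamination)
    # and dark (hole) regions against the clean-graphene level.
    coarse = ndimage.median_filter(filtered, size=31)
    ref = np.median(coarse)
    bad = (coarse < low * ref) | (coarse > high * ref)

    # Grow the mask so that the analysis stays clear of the region edges.
    bad = ndimage.binary_dilation(bad, iterations=dilate_px)
    return filtered, ~bad   # denoised image and valid-pixel mask
```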

The next step is to identify the graphene lattice in every low-dose micrograph with atomic precision. In total, we consider six parameters that bring the lattices of all exposures into registry: the first two are the direction and scaling of a linear distortion, the next two describe the rotation and lattice constant of the graphene lattice, and the last two are the Cartesian coordinates of a translational reference point.

Figure 2: Left panel: a representative low-dose micrograph. Right panel: central region of the power spectrum of the Fourier transform, with elliptical fits to the first- and second-order spots.

Figure 2 shows a representative low-dose STEM micrograph of an (indiscernible) clean freestanding graphene sheet. The most straightforward way to identify the lattice for the alignment of low-dose micrographs is the power spectrum of the discrete Fourier transform (FT), the central part of which is also displayed in Fig. 2. The bright spots in the power spectrum are the fingerprint of the graphene lattice, and the elliptical fits contain convoluted information about the scanning distortions and sample tilts. The first four parameters of the hexagonal sampling are initialized from elliptical fits like the one in Fig. 2. We then optimize all six parameters using a comparison in real space, which does not imply periodic boundary conditions and can also be applied to irregularly shaped regions of clean graphene. For this purpose, a simple coordinate descent along the six parameters is used.
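
The two ingredients of this step can be sketched as follows: the windowed power spectrum used for initialization, and a plain coordinate descent for the refinement. The real-space score function that rates a six-parameter lattice model against a masked micrograph is assumed to be given, and the step-size schedule is illustrative:

```python
import numpy as np

def lattice_power_spectrum(image):
    """Windowed power spectrum of the discrete Fourier transform (cf. Fig. 2);
    the graphene lattice appears as six bright first-order spots."""
    win = (np.hanning(image.shape[0])[:, None]
           * np.hanning(image.shape[1])[None, :])
    f = np.fft.fftshift(np.fft.fft2(image * win))
    return np.abs(f) ** 2

def coordinate_descent(score, params, steps, n_sweeps=20):
    """Simple coordinate descent along the six lattice parameters.

    `score(params)` is assumed to return the real-space match quality of
    the hexagonal lattice model against the masked micrograph (higher is
    better); `params` and `steps` hold the six values and trial step sizes.
    """
    params = np.asarray(params, dtype=float)
    steps = np.asarray(steps, dtype=float).copy()
    best = score(params)
    for _ in range(n_sweeps):
        improved = False
        for i in range(len(params)):
            for delta in (steps[i], -steps[i]):
                trial = params.copy()
                trial[i] += delta
                s = score(trial)
                if s > best:
                    params, best, improved = trial, s, True
        if not improved:
            steps *= 0.5   # refine the step sizes once a sweep stalls
    return params
```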

The data is then resampled onto hexagonal pixels, as described in full detail earlier [13]. The symmetry of hexagonal pixels, as well as of hexagonal unit cells, allows all point symmetries (rotations and mirrors) of the graphene lattice to be represented without any interpolation. We use a sampling density of 12 hexagons per carbon-carbon bond length for the averaged unit cells during the image alignment optimization. The final data for the maximum-likelihood reconstruction is then obtained by another re-sampling with 4 hexagons per C-C bond length. During resampling, we also correct the sample tilts, scanning distortions and scaling. For the re-sampling to hexagonal pixels, and also for the removal of outliers via median filtering as described above, it is useful that the initial data is highly oversampled. The hexagonal sampling results in a pixel area of  Å², well suited for atomic resolution. Figure 3 shows the graphene lattice in the average of all aligned images. This averaging reveals the central parking spot of the electron beam, but it cannot reveal any point defects.
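
A minimal sketch of the resampling onto hexagonal pixels is given below, assuming nearest-hexagon assignment via standard cube-coordinate rounding and returning the mean intensity per hexagon indexed by flattened axial coordinates. The actual implementation [13] additionally applies the tilt, distortion and scaling corrections during this step:

```python
import numpy as np

def hex_round(q, r):
    """Round fractional axial coordinates to the nearest hexagon centre
    (standard cube-coordinate rounding for hexagonal grids)."""
    x, z = q, r
    y = -x - z
    rx, ry, rz = np.rint(x), np.rint(y), np.rint(z)
    dx, dy, dz = np.abs(rx - x), np.abs(ry - y), np.abs(rz - z)
    fix_x = (dx > dy) & (dx > dz)          # re-derive the worst coordinate
    fix_z = ~fix_x & (dz > dy)
    rx = np.where(fix_x, -ry - rz, rx)
    rz = np.where(fix_z, -rx - ry, rz)
    return rx.astype(int), rz.astype(int)

def resample_to_hex(image, a_hex):
    """Average Cartesian pixels into hexagonal pixels of spacing `a_hex`
    (in input pixels); a sketch of the nearest-hexagon binning only."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    q = (np.sqrt(3) / 3 * xx - yy / 3) / a_hex   # fractional axial coords
    r = (2.0 / 3.0 * yy) / a_hex
    qi, ri = hex_round(q, r)
    qi -= qi.min(); ri -= ri.min()
    keys = qi * (ri.max() + 1) + ri              # flatten (q, r) index pairs
    sums = np.bincount(keys.ravel(), weights=image.ravel())
    counts = np.bincount(keys.ravel())
    return sums / np.maximum(counts, 1)          # mean intensity per hexagon
```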

Figure 3: Overlay of 1187 aligned and resampled micrographs. The long-range contrast variations arise from masking out irregular holes and areas with thick contamination.

Besides well-aligned input data, the maximum likelihood algorithm requires a noise model, i.e., an expression for the probability of measuring a value x when assuming an expectation value μ. Since the experimental detector response is difficult to model, we extract the PDF directly from the experimental data. For this purpose, the extracted data is grouped into bins by the absolute contrast (defined as the standard deviation) in the translationally averaged unit cell. A logarithmic grouping with 10-15% wide bins is found to be adequate for typical low-dose noise levels.
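
The logarithmic contrast grouping can be sketched as follows, with an illustrative bin width of 12% within the 10-15% range quoted above:

```python
import numpy as np

def contrast_groups(contrasts, width=0.12):
    """Assign each frame to a logarithmic contrast bin of ~12% width.

    `contrasts` holds the standard deviation of the translationally
    averaged unit cell of every frame; the bin width is an assumed value.
    """
    log_c = np.log(np.asarray(contrasts, dtype=float))
    edges = np.arange(log_c.min(), log_c.max() + width, width)
    return np.digitize(log_c, edges)   # bin index per frame
```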

Figure 4: Top panel: linear correlation between the expectation value and the standard deviation of equivalent pixels. Bottom panel: the actual histogram (black circles) of observed values on equivalent pixels for a typical expectation value of 512. The brightness in noise-free graphene varies between 441 and 608 (dashed verticals). The histogram is approximated by a Gamma distribution (red line).

The panels in Fig. 4 establish two crucial empirical facts: firstly, there is a linear correlation between the expectation value and the standard deviation of a pixel, and secondly, the empirical PDF can be represented by a Gamma distribution. This is in striking contrast to the ideal signal, which should consist of Poisson-distributed integer scattering events per pixel. We attribute this finding to the strong influence of the signal-processing hardware (scintillator, photomultiplier, electronic amplifier, analog-to-digital conversion, etc.) on the resulting noise. Regardless of its origin, the experimentally obtained noise model allows the PDF to be represented by a smooth analytic function that can be extended to expectation values below and above the contrast regime of clean graphene.
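
Given the linear mean-to-standard-deviation relation, a Gamma distribution for each expectation value can be obtained by simple moment matching. The following sketch uses scipy.stats; variable names are illustrative:

```python
import numpy as np
from scipy import stats

def gamma_noise_model(samples):
    """Moment-matched Gamma approximation of the empirical PDF (cf. Fig. 4).

    `samples` collects the observed values of equivalent pixels sharing
    one expectation value. Matching mean and variance fixes the shape
    and scale parameters of the Gamma distribution.
    """
    mean, var = np.mean(samples), np.var(samples)
    shape = mean ** 2 / var    # k = (mean / std)^2
    scale = var / mean         # theta = std^2 / mean
    return stats.gamma(a=shape, scale=scale)

# Usage sketch: probability density of observing x = 480 on equivalent
# pixels with expectation value 512, given their collected `samples`.
# p = gamma_noise_model(samples).pdf(480.0)
```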

The experimental images were split into smaller hexagonal pieces (supercells) of  nm each. This size is chosen to be just large enough to contain the expected defects [13]. The supercells are allowed to overlap by half of their diameter, to ensure that every point defect is captured completely at least once. A total of  supercells were extracted from the data. With these supercells and the experimentally obtained PDFs, the likelihood that a basis set of model images accounts for the entirety of the measured data can be directly calculated as [12, 13]:

\[
\mathcal{L} \;=\; \sum_{f} \ln \Lambda_{f}
\tag{1}
\]
\[
\Lambda_{f} \;=\; \sum_{m,s} w_{m} \prod_{p} P\!\left(x_{f,p}\,\middle|\,\mu_{m,s,p}\right)
\tag{2}
\]

Here, the index f denotes the different extracted hexagonal supercells, or frames for short; m runs through the different models in the basis; s denotes the space- and point-group-symmetry equivalent configurations of the frame, and p runs over all pixels within the models or frames. With the measure L, the models and their relative weights w_m can be varied to find the most likely explanation for the observed data. The probability P(x | μ) to observe a value x for a given expectation value μ in a model depends on the contrast group of the frame f. Trials for changed weights or pixel values can be rejected or accepted so as to increase the likelihood L.
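
Assuming the notation of Eqs. 1&2 as reconstructed above, the likelihood evaluation can be sketched as follows. The symmetry-expanded model stack and the calibrated log-PDF are taken as given, and a log-sum-exp is used for numerical stability (an implementation detail not specified in the text):

```python
import numpy as np

def log_likelihood(frames, models, weights, log_pdf):
    """Evaluate Eqs. (1)&(2) for a stack of frames (illustrative sketch).

    frames  : array (F, P), observed hexagonal supercells
    models  : array (M, S, P), models expanded over the S point-group
              equivalent configurations of the hexagonal cell
    weights : array (M,), relative model weights w_m
    log_pdf : callable (x, mu) -> elementwise ln P(x | mu); here the
              empirically calibrated Gamma model of Fig. 4
    """
    total = 0.0
    for x in frames:
        # ln prod_p P(x_p | mu_{m,s,p}) for every model m and symmetry s
        lp = log_pdf(x[None, None, :], models).sum(axis=-1)   # (M, S)
        lp += np.log(weights)[:, None]                        # add ln w_m
        # ln Lambda_f via a numerically stable log-sum-exp over (m, s)
        peak = lp.max()
        total += peak + np.log(np.exp(lp - peak).sum())
    return total
```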

The likelihood maximization is initialized with a set of four identical models representing the empty lattice (top row in Fig. 5). Upon maximization, one of them quickly converges to the model in the second row of Fig. 5. After duplicating all models and further optimizing the likelihood, one of the descendants of the model in the second row morphs into the one in the third row. These alternating steps of optimization and cloning can be repeated until no new defects are found. The selection presented here focuses on the direct predecessors of the four fundamentally different vacancy defects and disregards the visually similar clones and empty lattices that naturally arise from this scheme. The red arrows trace the direct lines of descent. Blue dots mark the roots of entire branches not included in the selection. The collected weights over equivalent appearances of the four archetypal defects are, from left to right, 10.4%, 0.5%, 1.8% and 1.7%. These frequencies correspond approximately to the occurrences per model area of 3.35 nm².
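
A greedy trial-and-accept scheme consistent with this description can be sketched as follows. The perturbation size, the trial count and the symmetry-expansion helper are illustrative assumptions; log_likelihood is the function from the previous sketch, and weight updates would follow the same accept/reject pattern:

```python
import numpy as np

rng = np.random.default_rng()

def optimize_models(base_models, weights, frames, log_pdf,
                    expand_symmetries, n_trials=10_000, step=0.05):
    """Greedy trial-and-accept likelihood maximization (illustrative).

    `base_models` has shape (M, P); `expand_symmetries` is a hypothetical
    helper mapping it to the (M, S, P) stack of symmetry-equivalent
    configurations via precomputed pixel permutations of the hexagonal cell.
    """
    best = log_likelihood(frames, expand_symmetries(base_models),
                          weights, log_pdf)
    for _ in range(n_trials):
        trial = base_models.copy()
        m = rng.integers(len(trial))         # pick one model ...
        p = rng.integers(trial.shape[-1])    # ... and one of its pixels
        trial[m, p] += rng.normal(scale=step)
        l = log_likelihood(frames, expand_symmetries(trial),
                           weights, log_pdf)
        if l > best:                         # accept only improvements
            base_models, best = trial, l
    return base_models

def clone_models(base_models, weights):
    """Duplicate all models so the copies can diverge into new defect
    structures during the next optimization round (cf. Fig. 5)."""
    return (np.concatenate([base_models, base_models]),
            np.concatenate([weights, weights]) / 2.0)
```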

Figure 5: The four initial models are shown in the top row. Descendants are connected by red arrows. Blue dots indicate hidden branches of equivalent models. The bottom row shows a selection from a total of 128 descendants. The weights are, from left to right, 10.4%, 0.5%, 1.8% and 1.7%.

5 Discussion of defect structures

It is interesting to note that the only divacancy we find in the maximum likelihood reconstruction is the 5-8-5 defect, with a weight of 10.4%, while the other divacancies (those labeled 555777 and 555567777 in Ref. [18]) are absent. From the lowest confirmed detection, we can conclude that the 5-8-5 defect dominates over the other divacancies by at least a factor of 20. This is very different from the more balanced divacancy populations reported from lower-voltage and much higher-dose acquisition series of one single divacancy [10, 11] (note also that in the earlier work of Ref. [13], where we used conventional, high-dose experimental data that was artificially resampled to mimic low-dose conditions, the 555777 and 555567777 defects could be recovered).

The set of models follows a pattern from the simplest divacancy to paired divacancies and a five-fold vacancy with a bridging atom. This series can be rationalized by considering that the primary dynamics is effectively described by the e-beam removing entire bonds [19]: after the first atom is kicked out, it becomes very likely that one of its neighbors is removed immediately afterwards. The absence of single vacancies, as well as of any more complex divacancy configurations, in Fig. 5 suggests that under irradiation with 100 keV electrons the removal of an entire carbon dimer dominates, while the series of Stone-Wales transformations that would be necessary to form vacancies with three or four pairs of a 5- and a 7-membered ring [8, 18] occurs only under much higher irradiation doses. The model with a bridging atom [20] can be obtained by kicking out one of the atoms of the central square in the preceding model. The exclusive observation of this specific high-symmetry bridge defect suggests that the ejection of this atom leads to a relatively long-lived structure. Single vacancies were not found in this study, but we point out that they are also only very rarely seen in higher-dose TEM or STEM studies of graphene.

The lowest demonstrated detection is that of the 5-fold bridge vacancy, with a weight of only 0.5%, corresponding to a total of  occurrences, or a density of one defect per . The 5-8-5 divacancy, with a weight of 10.4%, in contrast, corresponds to  occurrences in the data, or a density of one defect per . At a S/N of -6.9 dB, roughly 100 cases or % would be needed to obtain a satisfactory S/N of  dB. The algorithm was shown to succeed on extremely low signal levels of  dB with simulated data [13]. However, the practical implementation of the data processing requires the lattice spots to be detectable in the Fourier transform, which imposes another S/N limit (dependent on the field of view per exposure). We point out that there is no sharp detection limit in terms of the required defect density; rather, whether a structure can be recovered also depends on the amount of available data, its S/N, and the size and contrast of the feature.

6 Conclusions

We have demonstrated that the retrieval of effective high-dose images is feasible for sizeable sets of low-dose annular dark-field micrographs. In the present case, the individual exposures have an electron dose of  e/Å² and a signal-to-noise ratio, after resampling to the targeted resolution, of −6.9 dB. Under these conditions, the reconstruction algorithm succeeds in independently identifying one missing pair of carbon atoms, together with the surrounding lattice relaxation, as a recurring feature in the graphene lattice. The algorithm can also discriminate paired missing bonds as well as a five-fold bridge vacancy. Notably, the populations of extended divacancy reconstructions with 3 or 4 pairs of a 5- and a 7-membered ring are found to be below the confirmed detection threshold of 500 occurrences and are at least 20 times less frequent. The set of the most frequently occurring vacancy defects can be rationalized in terms of faster atom and atom-pair removal by a 100 keV electron beam as compared to the sequential Stone-Wales transformations that would be required to form the more extended divacancies. The low-dose imaging approach combined with maximum likelihood reconstruction holds promise for studying native defects in a material while minimizing changes in the structure caused by the electron irradiation. The successful image recovery of as little as a few (missing) carbon atoms is a promising sign for more challenging specimens, such as molecules deposited on graphene, or functional groups attached to it.

Acknowledgements

We acknowledge support from the European Research Council (ERC) Project No. 336453-PICOMAT.

References

  • [1] M. Haider, S. Uhlemann, E. Schwan, H. Rose, B. Kabius, and K. Urban, Nature 392(6678), 768–769 (1998).
  • [2] O. Krivanek, Ultramicroscopy 78, 1–11 (1999).
  • [3] K. Suenaga, H. Wakabayashi, M. Koshino, Y. Sato, K. Urita, and S. Iijima, Nature Nanotechnology 2(6), 358–360 (2007).
  • [4] J. C. Meyer, C. Kisielowski, R. Erni, M. D. Rossell, M. F. Crommie, and A. Zettl, Nano Letters 8(11), 3582–3586 (2008).
  • [5] O. L. Krivanek, M. F. Chisholm, V. Nicolosi, T. J. Pennycook, G. J. Corbin, N. Dellby, M. F. Murfitt, C. S. Own, Z. S. Szilagyi, M. P. Oxley, S. T. Pantelides, and S. J. Pennycook, Nature 464(7288), 571–574 (2010).
  • [6] R. F. Egerton, Microscopy Research and Technique 75(11), 1550–1556 (2012).
  • [7] R. Hovden and D. A. Muller, Ultramicroscopy 123, 59–65 (2012).
  • [8] J. Kotakoski, A. Krasheninnikov, U. Kaiser, and J. Meyer, Phys. Rev. Lett. 106(10), 1–4 (2011).
  • [9] J. H. Warner, E. R. Margine, M. Mukai, A. W. Robertson, F. Giustino, and A. I. Kirkland, Science 337(6091), 209–212 (2012).
  • [10] P. Börner, U. Kaiser, and O. Lehtinen, Phys. Rev. B 93, 134104 (2016).
  • [11] J. Kotakoski, C. Mangler, and J. C. Meyer, Nature Communications 5 (2014).
  • [12] J. C. Meyer, J. Kotakoski, and C. Mangler, Ultramicroscopy 145, 13–21 (2014).
  • [13] C. Kramberger and J. C. Meyer, Ultramicroscopy 170, 60–68 (2016).
  • [14] J. Frank, Three-Dimensional Electron Microscopy of Macromolecular Assemblies: Visualization of Biological Molecules in Their Native State (Oxford University Press, 2006).
  • [15] Z. H. Zhou, Current Opinion in Structural Biology 18(2), 218–228 (2008).
  • [16] A. Mittelberger, C. Kramberger, C. Hofer, C. Mangler, and J. C. Meyer, Microscopy and Microanalysis (2017).
  • [17] J. Meyer, F. Eder, S. Kurasch, V. Skakalova, J. Kotakoski, H. Park, S. Roth, A. Chuvilin, S. Eyhusen, G. Benner, A. Krasheninnikov, and U. Kaiser, Phys. Rev. Lett. 108(19), 196102 (2012).
  • [18] J. Kotakoski, J. Meyer, S. Kurasch, D. Santos-Cottin, U. Kaiser, and A. Krasheninnikov, Phys. Rev. B 83(24) (2011).
  • [19] J. Kotakoski and A. Krasheninnikov, Native and irradiation-induced defects in graphene: what can we learn from atomistic simulations?, in: Computational Nanoscience, edited by E. Bichoutskaia (Royal Society of Chemistry, 2011).
  • [20] A. W. Robertson, G. D. Lee, K. He, E. Yoon, A. I. Kirkland, and J. H. Warner, Nano Letters 14(7), 3972–3980 (2014), PMID: 24959991.