This page was generated from doc/tutorials/hough_indexing.ipynb. An interactive online version is available via Binder.

Hough indexing#

In this tutorial, we will perform Hough indexing (HI) using PyEBSDIndex. We will use a tiny 13 MB dataset of nickel available in kikuchipy.

Note

kikuchipy cannot depend on PyEBSDIndex at the moment, as PyEBSDIndex does not support all the combinations of Python versions and operating systems that kikuchipy does. To install PyEBSDIndex, see their installation instructions.
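
For reference, PyEBSDIndex is distributed on PyPI under the name pyebsdindex, so a basic installation into the same environment as kikuchipy typically amounts to the command below; see their installation instructions for optional dependencies such as GPU support.

pip install pyebsdindex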

PyEBSDIndex supports indexing face centered and body centered cubic (FCC and BCC) materials.

Let’s import the necessary libraries

[1]:
# Exchange inline for notebook or qt5 (from pyqt) for interactive plotting
%matplotlib inline

from diffpy.structure import Atom, Lattice, Structure
from diffsims.crystallography import ReciprocalLatticeVector
import kikuchipy as kp
import matplotlib.pyplot as plt
import numpy as np
from orix import plot
from orix.crystal_map import create_coordinate_arrays, CrystalMap, Phase, PhaseList
from orix.quaternion import Rotation
from orix.vector import Vector3d
from pyebsdindex import ebsd_index, pcopt


plt.rcParams.update(
    {"font.size": 15, "lines.markersize": 15, "scatter.edgecolors": "k"}
)

Load a dataset of (75, 55) nickel EBSD patterns of (60, 60) pixels with a step size of 1.5 μm

[2]:
s = kp.data.nickel_ebsd_large(allow_download=True)
s
[2]:
<EBSD, title: patterns Scan 1, dimensions: (75, 55|60, 60)>

Pre-indexing maps#

First, we produce two indexing-independent maps showing microstructural features: a virtual backscatter electron (VBSE) image and an image quality (IQ) map. The former uses the BSE yield on the detector to give qualitative orientation contrast, and is therefore obtained from raw, unprocessed patterns. The latter assumes that the sharper the Kikuchi bands, the higher the image quality, and is therefore obtained from processed patterns.

[3]:
# Generate virtual BSE images from a grid of detector tiles; the default grid shape is (5, 5)
vbse_gen = kp.generators.VirtualBSEGenerator(s)
vbse_gen.grid_shape
[3]:
(5, 5)

Get the VBSE image by coloring the three center grid tiles red, green and blue

[4]:
maps_vbse_rgb = vbse_gen.get_rgb_image(r=(2, 1), g=(2, 2), b=(2, 3))
maps_vbse_rgb
[4]:
<VirtualBSEImage, title: , dimensions: (|75, 55)>

Plot the VBSE image

[5]:
maps_vbse_rgb.plot()
../_images/tutorials_hough_indexing_10_0.png

We see that we have 20-30 grains, many of them apparently twinned.

Enhance the Kikuchi bands by removing the static and dynamic background (see the pattern processing tutorial for details)

[6]:
s.remove_static_background()
s.remove_dynamic_background()
[########################################] | 100% Completed | 205.71 ms
[########################################] | 100% Completed | 815.03 ms

Get the IQ map

[7]:
maps_iq = s.get_image_quality()
[########################################] | 100% Completed | 417.11 ms

Plot the IQ map

[8]:
fig, ax = plt.subplots()
ax.imshow(maps_iq, cmap="gray");
../_images/tutorials_hough_indexing_16_0.png

We recognize the grain and (presumably) twinning boundaries from the VBSE image, and also some dark lines, e.g. to the lower and upper left, which look like scratches on the sample surface.

Calibrate detector-sample geometry#

We need to know the position of the sample with respect to the detector, the so-called projection/pattern center (PC) (see the reference frames tutorial for all conventions). We do this by optimizing an initial guess of the PC obtained from similar experiments on the same microscope.
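
PC conventions differ between vendors. As a minimal sketch (the PC values below are made up for illustration), kikuchipy's EBSDDetector stores the PC in the Bruker convention and can return it in other vendors' conventions, e.g. EDAX TSL:

# Hypothetical PC values, for illustration only
det_example = kp.detectors.EBSDDetector(shape=(60, 60), pc=(0.42, 0.22, 0.50), sample_tilt=70)
print(det_example.pc)        # PC in the Bruker convention used internally by kikuchipy
print(det_example.pc_tsl())  # The same PC expressed in the EDAX TSL convention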

Create an indexer, specifying the sample and camera tilts

[9]:
sig_shape = s.axes_manager.signal_shape[::-1]
indexer = ebsd_index.EBSDIndexer(
    phaselist=["FCC"],  # FCC, BCC or both
    vendor="KIKUCHIPY",
    sampleTilt=70,
    camElev=0,
    patDim=sig_shape,
)

Optimize the PC using some patterns from the dataset, spread out evenly in a map grid

[10]:
pc_grid_shape = (4, 5)

# Determine appropriate 2D indices of the patterns in the EBSD map
nav_shape = s.axes_manager.navigation_shape[::-1]
steps = np.ceil(np.array(nav_shape) / (np.array(pc_grid_shape) + 1)).astype(int)
idx_1d_all = np.arange(np.prod(nav_shape)).reshape(nav_shape)
idx_1d = idx_1d_all[:: steps[0], :: steps[1]][1:, 1:]
idx_2d = np.stack(np.unravel_index(idx_1d, nav_shape))
subtract_to_center = (idx_2d[:, 0, 0] - (nav_shape - idx_2d[:, -1, -1])) // 2
idx_2d[0] -= subtract_to_center[0]
idx_2d[1] -= subtract_to_center[1]

Plot the PC grid on the IQ map

[11]:
pc_color = np.arange(np.prod(pc_grid_shape))

fig, ax = plt.subplots()
ax.imshow(maps_iq, cmap="gray")
ax.scatter(*idx_2d[::-1], c=pc_color);
../_images/tutorials_hough_indexing_23_0.png

Get the patterns to optimize the PC with

[12]:
patterns_pc = s.data[tuple(idx_2d)].reshape(-1, *sig_shape)

Optimize PCs individually

[13]:
pc0 = (0.4, 0.2, 0.5)
pcs = np.zeros((patterns_pc.shape[0], 3))
for i in range(patterns_pc.shape[0]):
    pcs[i] = pcopt.optimize(patterns_pc[i], indexer, pc0)

# Use instead once PyEBSDIndex v0.1.1 is released
# pcs = pcopt.optimize(patterns_pc, indexer, pc0, batch=True)

Plot the PC values

[14]:
fig, (ax0, ax1, ax2) = plt.subplots(ncols=3, figsize=(15, 5))
ax0.scatter(pcs[:, 0], pcs[:, 1], c=pc_color)
ax0.set_xlabel("PCx")
ax0.set_ylabel("PCy")
ax1.scatter(pcs[:, 0], pcs[:, 2], c=pc_color)
ax1.set_xlabel("PCx")
ax1.set_ylabel("PCz")
ax2.scatter(pcs[:, 2], pcs[:, 1], c=pc_color)
ax2.set_xlabel("PCz")
ax2.set_ylabel("PCy")
fig.tight_layout()
../_images/tutorials_hough_indexing_29_0.png

The values are not as neatly ordered as the map grid they were obtained from, but that is to be expected with these highly binned (60, 60) pixel patterns. The values do not vary much at this magnification, so we’ll use the average PC for indexing. We can plot the PC on a pattern using the EBSDDetector

[15]:
detector = kp.detectors.EBSDDetector(shape=sig_shape, pc=pcs.mean(0), sample_tilt=70)
detector
[15]:
EBSDDetector (60, 60), px_size 1 um, binning 1, tilt 0, azimuthal 0, pc (0.422, 0.215, 0.502)
[16]:
detector.plot(pattern=patterns_pc[0])
../_images/tutorials_hough_indexing_32_0.png

Perform indexing#

Index the patterns using the average PC, also plotting the Hough transform and the nine detected bands used to index the first pattern

[17]:
data, *_ = indexer.index_pats(s.data.reshape(-1, *sig_shape), PC=detector.pc, verbose=2)
Radon Time: 9.302672257999802
Convolution Time: 3.975002367000343
Peak ID Time: 3.174084916998254
Band Label Time: 1.0171890359979443
Total Band Find Time: 17.469279072000063
../_images/tutorials_hough_indexing_34_1.png
Band Vote Time:  2.961466962000486

Generate a CrystalMap for easy saving and analysis of the indexing results (see the PyEBSDIndex Hough indexing tutorial for a complete explanation of all the indexing result parameters)

[18]:
# Generate CrystalMap (should make a convenience function in orix for this!)
xy, _ = create_coordinate_arrays(
    nav_shape, step_sizes=(s.axes_manager["y"].scale, s.axes_manager["x"].scale)
)
xmap = CrystalMap(
    rotations=Rotation(data[-1]["quat"]),
    x=xy["x"],
    y=xy["y"],
    phase_list=PhaseList(
        Phase(
            name="ni",
            space_group=225,
            structure=Structure(
                lattice=Lattice(0.35236, 0.35236, 0.35236, 90, 90, 90),
                atoms=[Atom("Ni", [0, 0, 0])],
            ),
        )
    ),
    prop=dict(
        pq=data[-1]["pq"],  # Pattern quality
        cm=data[-1]["cm"],  # Confidence metric
        fit=data[-1]["fit"],  # Pattern fit
        nmatch=data[-1]["nmatch"],  # Number of detected bands matched
        iq=maps_iq.ravel(),
    ),
    scan_unit="um",
)

xmap
[18]:
Phase   Orientations  Name  Space group  Point group  Proper point group     Color
    0  4125 (100.0%)    ni        Fm-3m         m-3m                 432  tab:blue
Properties: pq, cm, fit, nmatch, iq
Scan unit: um
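
Since the crystal map is meant for easy saving as well as analysis, it can for instance be written to an HDF5 file with orix (the file name below is just an example):

from orix import io

io.save("ni_hough_indexed.h5", xmap)  # Example file name; an .ang file also works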

Analyze indexing results#

Plot quality metrics

[19]:
fig, ax = plt.subplots(nrows=2, ncols=2, figsize=(10, 5.5))
for a, to_plot in zip(ax.ravel(), ["pq", "cm", "fit", "nmatch"]):
    im = a.imshow(xmap.get_map_data(to_plot))
    fig.colorbar(im, ax=a, label=to_plot)
    a.axis("off")
fig.tight_layout(pad=0.5)
../_images/tutorials_hough_indexing_39_0.png

The pattern quality (PQ) and confidence metric (CM) maps show little variation across the sample. The most important map here is the pattern fit (also known as the mean angular error/deviation), which gives the average angular deviation between the position of each detected band and the closest theoretical band: it is below an acceptable fit of 1.5\(^{\circ}\) across most of the map. The final map (nmatch) shows that most of the nine detected bands in each pattern were indexed within a pattern fit of 3\(^{\circ}\).
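
To put rough numbers on these statements, we can for instance compute the fraction of map points with a fit below 1.5\(^{\circ}\) and the fraction where all nine detected bands were matched, using the properties defined when creating the crystal map above:

frac_good_fit = np.mean(xmap.fit < 1.5)     # Fraction of points with fit below 1.5 degrees
frac_all_bands = np.mean(xmap.nmatch == 9)  # Fraction of points with all nine bands matched
print(frac_good_fit, frac_all_bands)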

Create a color key to color orientations with

[20]:
ckey = plot.IPFColorKeyTSL(xmap.phases[0].point_group)
ckey.plot()
../_images/tutorials_hough_indexing_41_0.png

Orientations are given a color based on which crystal direction \(\left<uvw\right>\) points along a certain sample direction, producing the so-called inverse pole figure (IPF) map. Let’s plot the IPF-Z map with the CM map overlaid

[21]:
xmap.plot(ckey.orientation2color(xmap.rotations), overlay="cm", remove_padding=True)
../_images/tutorials_hough_indexing_43_0.png

Let’s also plot the three maps side by side

[22]:
directions = Vector3d(((1, 0, 0), (0, 1, 0), (0, 0, 1)))
n = directions.size

fig, ax = plt.subplots(ncols=n, figsize=(7 * n, 8))
for i, title in zip(range(n), ["X", "Y", "Z"]):
    ckey.direction = directions[i]
    rgb = ckey.orientation2color(xmap.rotations)
    ax[i].imshow(rgb.reshape(xmap.shape + (3,)))
    ax[i].set_title(f"IPF-{title}")
    ax[i].axis("off")
fig.tight_layout()
../_images/tutorials_hough_indexing_45_0.png

The orientation maps show grains and twins as we would expect from the VBSE image and IQ map obtained before indexing.

As a final verification, we’ll plot geometrical simulations on top of the experimental patterns (see the geometrical simulations tutorial for details)

[23]:
ref = ReciprocalLatticeVector(
    phase=xmap.phases[0], hkl=[[1, 1, 1], [2, 0, 0], [2, 2, 0], [3, 1, 1]]
)
ref = ref.symmetrise()
simulator = kp.simulations.KikuchiPatternSimulator(ref)
sim = simulator.on_detector(detector, xmap.rotations.reshape(*xmap.shape))
Finding bands that are in some pattern:
[########################################] | 100% Completed | 101.91 ms
Finding zone axes that are in some pattern:
[########################################] | 100% Completed | 102.05 ms
Calculating detector coordinates for bands and zone axes:
[########################################] | 100% Completed | 102.09 ms

Add markers to EBSD signal

[24]:
markers = sim.as_markers()
s.add_marker(markers, plot_marker=False, permanent=True)

Navigate patterns with simulations in IPF-Z map (see the visualization tutorial for details)

[25]:
# Use the IPF-Z colors from above as an RGB navigator for the patterns
maps_nav_rgb = kp.draw.get_rgb_navigator(rgb.reshape(xmap.shape + (3,)))
s.plot(navigator=maps_nav_rgb)
../_images/tutorials_hough_indexing_52_0.png
../_images/tutorials_hough_indexing_52_1.png

We can refine the orientation results using dynamical simulations. See the refinement section of the pattern matching tutorial for how to do that.
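
As a minimal sketch of what such a refinement could look like (assuming the small nickel master pattern shipped with kikuchipy; a full-resolution master pattern is recommended in practice):

# Load a dynamically simulated master pattern in the Lambert projection at 20 keV
mp = kp.data.nickel_ebsd_master_pattern_small(projection="lambert", energy=20)

# Refine the Hough-indexed orientations against dynamical simulations
xmap_refined = s.refine_orientation(
    xmap=xmap, detector=detector, master_pattern=mp, energy=20
)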