This page was generated from doc/tutorials/pc_calibration_moving_screen_technique.ipynb.

# PC calibration: “moving-screen” technique

The gnomonic projection (pattern) center (PC) of an EBSD detector can be estimated by the “moving-screen” technique, which we will test in this tutorial.

The technique relies on the assumption that the beam normal, shown in the top figure (d) in the reference frames tutorial, is normal to the detector screen as well as the incoming electron beam, and will therefore intersect the screen at a position independent of the detector distance (DD). To find this position, we need two EBSD patterns acquired with a stationary beam but with a known difference $$\Delta z$$ in DD, say 5 mm.

First, the goal is to find the pattern position which does not shift between the two camera positions, ($$PC_x$$, $$PC_y$$). This point can be estimated in fractions of screen width and height, respectively, by selecting the same pattern features in both patterns. The two points of each pattern feature can then be used to form a straight line, and two or more such lines should intersect at ($$PC_x$$, $$PC_y$$).
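To make the intersection step concrete, here is a minimal sketch (not kikuchipy's implementation, and `intersect_lines` is a hypothetical helper) of finding the least-squares intersection point of the lines formed by corresponding feature positions:

```python
import numpy as np


def intersect_lines(points_in, points_out):
    """Least-squares intersection of the 2D lines through corresponding
    point pairs (one pair per pattern feature).

    Hypothetical illustration only; kikuchipy does this internally.
    """
    p = np.asarray(points_in, dtype=float)
    q = np.asarray(points_out, dtype=float)
    d = q - p
    d /= np.linalg.norm(d, axis=1, keepdims=True)  # unit direction per line
    # For each line, the projector P_i = I - d_i d_i^T measures the
    # perpendicular offset from the line; minimizing the summed squared
    # offsets gives the normal equations (sum P_i) x = sum P_i p_i
    proj = np.eye(2) - d[:, :, None] * d[:, None, :]
    A = proj.sum(axis=0)
    b = (proj @ p[:, :, None]).sum(axis=0).squeeze()
    return np.linalg.solve(A, b)


# Two lines crossing at (1, 2)
print(intersect_lines([(0, 0), (1, 0)], [(2, 4), (1, 4)]))  # ~[1. 2.]
```

With more than two lines the system is overdetermined and the solution is the point minimizing the summed squared distances to all lines, which is why using three or more features gives a more robust estimate.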

Second, the DD ($$PC_z$$) can be estimated from the same points. After finding the distances $$L_{in}$$ and $$L_{out}$$ between two points (features) in both patterns (in = operating position, out = 5 mm from operating position), the DD can be found from the relation

$$\mathrm{DD} = \frac{\Delta z}{L_{out}/L_{in} - 1},$$

where DD is given in the same unit as the known camera distance difference. If the detector pixel size $$\delta$$ is also known (e.g. 46 mm / 508 px), $$PC_z$$ can be given as a fraction of the detector screen height

$$PC_z = \frac{\mathrm{DD}}{N_r \delta b},$$

where $$N_r$$ is the number of detector rows and $$b$$ is the binning factor.
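As a quick numeric sanity check of the two relations above, using made-up feature separations (`L_in`, `L_out` are illustrative values only) together with the detector parameters used later in this tutorial (46/508 mm per pixel, 480 detector rows, no binning):

```python
delta_z = 5  # mm, known change in detector distance between the two patterns
L_in, L_out = 300.0, 369.3  # hypothetical feature separations in px

# DD from the ratio of feature separations
dd = delta_z / (L_out / L_in - 1)  # mm

# PCz as a fraction of the detector height
n_rows, px_size, binning = 480, 46 / 508, 1
pc_z = dd / (n_rows * px_size * binning)

print(f"DD = {dd:.1f} mm, PCz = {pc_z:.3f}")
```

Note that moving the screen further out makes features spread apart ($$L_{out} > L_{in}$$), so the denominator is positive and a larger spread (shorter DD) gives a smaller $$PC_z$$.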

Let’s first import necessary libraries

[1]:

# Exchange inline for notebook or qt5 (from pyqt) for interactive plotting
%matplotlib inline

from diffsims.crystallography import ReciprocalLatticeVector
from orix.crystal_map import Phase
from orix.quaternion import Rotation
import matplotlib.pyplot as plt
import numpy as np
import kikuchipy as kp


We will find an estimate of the PC from two single crystal silicon EBSD patterns, which are included in the kikuchipy.data module

[2]:

s_in = kp.data.silicon_ebsd_moving_screen_in(allow_download=True)
s_in.remove_static_background()
s_in.remove_dynamic_background()

s_out5mm = kp.data.silicon_ebsd_moving_screen_out5mm(allow_download=True)
s_out5mm.remove_static_background()
s_out5mm.remove_dynamic_background()

[########################################] | 100% Completed | 103.99 ms
[########################################] | 100% Completed | 103.28 ms
[########################################] | 100% Completed | 106.77 ms
[########################################] | 100% Completed | 102.07 ms


As a first approximation, we can find the detector pixel positions of the same features in both patterns by plotting the patterns with an interactive backend (e.g. qt5 or notebook), hovering over image pixels and noting the coordinates shown in the upper right of the Matplotlib window

[3]:

fig, (ax0, ax1) = plt.subplots(ncols=2, sharex=True, sharey=True, figsize=(20, 10))
ax0.imshow(s_in.data, cmap="gray")
_ = ax1.imshow(s_out5mm.data, cmap="gray")


For this example we choose the positions of three zone axes. The PC calibration is performed by creating an instance of the PCCalibrationMovingScreen class

[4]:

cal = kp.detectors.PCCalibrationMovingScreen(
    pattern_in=s_in.data,
    pattern_out=s_out5mm.data,
    points_in=[(109, 131), (390, 139), (246, 232)],
    points_out=[(77, 146), (424, 156), (246, 269)],
    delta_z=5,
    px_size=None,  # Default
    convention="tsl",  # Default
)
cal

[4]:

PCCalibrationMovingScreen: (PCx, PCy, PCz) = (0.5123, 0.8606, 21.6518)
3 points:
[[[109 131]
  [390 139]
  [246 232]]

 [[ 77 146]
  [424 156]
  [246 269]]]


We see that ($$PC_x$$, $$PC_y$$) = (0.5123, 0.8606), while DD = 21.7 mm. To get $$PC_z$$ in fractions of detector height, we have to provide the detector pixel size $$\delta$$ upon initialization, or set it directly and recalculate the PC

[5]:

cal.px_size = 46 / 508  # mm/px
cal

[5]:

PCCalibrationMovingScreen: (PCx, PCy, PCz) = (0.5123, 0.8606, 0.4981)
3 points:
[[[109 131]
  [390 139]
  [246 232]]

 [[ 77 146]
  [424 156]
  [246 269]]]


We can visualize the estimation by using the convenience method PCCalibrationMovingScreen.plot()

[6]:

cal.plot()


As expected, the three lines in the right figure meet at approximately the same point. We can replot the three images and zoom in on the PC to see how close they are to each other

[7]:

# With "bruker", PCy is defined from top to bottom; with "tsl" it is
# defined from bottom to top
cal.convention = "bruker"
pcx, pcy, _ = cal.pc

# Use two standard deviations of all PCx estimates as the axis limits
# (scaled with pattern shape)
two_std = 2 * np.std(cal.pcx_all, axis=0)

fig = cal.plot(return_figure=True)
ax2 = fig.axes[2]
ax2.set_xlim([cal.ncols * (pcx - two_std), cal.ncols * (pcx + two_std)])
_ = ax2.set_ylim([cal.nrows * (pcy - two_std), cal.nrows * (pcy + two_std)])


We can use this PC estimate as an initial guess when refining the PC using Hough indexing available from the PyEBSDIndex Python package.

Note

kikuchipy cannot depend on PyEBSDIndex at the moment, as PyEBSDIndex does not support all the combinations of Python versions and operating systems that kikuchipy does. To install PyEBSDIndex, see their installation instructions.

PyEBSDIndex supports indexing face-centered and body-centered cubic (FCC and BCC) materials.

[8]:

from pyebsdindex import ebsd_index, pcopt

[9]:

indexer = ebsd_index.EBSDIndexer(
    vendor="KIKUCHIPY", sampleTilt=70, camElev=0, patDim=cal.shape
)

[10]:

pc_ref = pcopt.optimize(s_in.data, indexer, cal.pc)

# Compare initial guess and refined PC
print(cal.pc)
print(pc_ref)

[0.51234319 0.13935302 0.49814811]
[0.51711618 0.15022853 0.48105299]


Let’s index the pattern and plot a geometrical simulation on top of the “in” pattern using Kikuchi band centers and zone axes from the five $$\{hkl\}$$ families $$\{111\}$$, $$\{200\}$$, $$\{220\}$$, $$\{222\}$$, and $$\{311\}$$

[11]:

# Hough indexing to get orientation
data, *_ = indexer.index_pats(s_in.data, PC=pc_ref)
rot = Rotation(data[-1]["quat"])

# Create simulator
phase = Phase(space_group=227)
ref = ReciprocalLatticeVector(
    phase=phase, hkl=[[1, 1, 1], [2, 0, 0], [2, 2, 0], [2, 2, 2], [3, 1, 1]]
)
ref = ref.symmetrise()
simulator = kp.simulations.KikuchiPatternSimulator(ref)

# Specify detector to simulate a pattern for and project onto the detector
detector = kp.detectors.EBSDDetector(
    cal.shape, pc=pc_ref, sample_tilt=indexer.sampleTilt
)
sim = simulator.on_detector(detector, rot)

Finding bands that are in some pattern:
[########################################] | 100% Completed | 102.19 ms
Finding zone axes that are in some pattern:
[########################################] | 100% Completed | 103.67 ms
Calculating detector coordinates for bands and zone axes:
[########################################] | 100% Completed | 101.59 ms

[12]:

sim.plot(coordinates="gnomonic", pattern=s_in.data, zone_axes_labels=False)