- LazyEBSD.get_decomposition_model_write(components: Optional[Union[int, List[int]]] = None, dtype_learn: Union[str, dtype, type] = 'float32', mbytes_chunk: int = 100, dir_out: Optional[str] = None, fname_out: Optional[str] = None) → None
Write the model signal generated from the selected number of principal components directly to a file.
The model signal intensities are rescaled to the original signals’ data type range, keeping relative intensities.
Parameters:

- components : int, list of int, or None, optional
  If not given, the signal is rebuilt from all components. If an int is passed, the signal is rebuilt from the components in the range 0 to the given int. If a list of int is passed, the signal is rebuilt from only the components in the given list.
- dtype_learn : str, numpy.dtype, or type, optional
  Data type to cast the learning results to before multiplication (default is "float32").
- mbytes_chunk : int, optional
  Size of learning results chunks in MB, by default 100 MB, as suggested in the Dask documentation.
- dir_out : str, optional
  Directory to place the output signal in.
- fname_out : str, optional
  Name of the output signal file.
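The rescaling of model intensities to the original signal's data type range, keeping relative intensities, can be sketched as follows. This is a minimal NumPy illustration under assumed min-max scaling; the function name and approach are not kikuchipy's actual implementation:

```python
import numpy as np

def rescale_to_dtype_range(model, dtype_out):
    """Stretch intensities to the full range of an integer dtype,
    preserving the relative ordering of intensities."""
    info = np.iinfo(dtype_out)
    model = model.astype("float64")
    model -= model.min()
    model /= model.max()  # now in [0, 1]
    return (model * (info.max - info.min) + info.min).astype(dtype_out)

data = np.array([[0.1, 0.5], [0.9, 0.3]])
rescaled = rescale_to_dtype_range(data, np.uint8)  # intensities span 0-255
```

The smallest intensity maps to the dtype's minimum and the largest to its maximum, so contrast between patterns is preserved while the output fits the original integer data type.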
Multiplying the learning results' factors and loadings in memory to create the model signal can sometimes not be done because the matrices are too large. Instead, the learning results are written to file, read back into dask arrays, and multiplied out of core using dask.array.matmul().
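The out-of-core multiplication can be sketched with small arrays. The shapes and chunk sizes below are arbitrary illustrations, not the chunking kikuchipy derives from mbytes_chunk:

```python
import numpy as np
import dask.array as da

rng = np.random.default_rng(42)
# Hypothetical decomposition results: 10 patterns, 3 components,
# 64 pixels per pattern.
loadings = da.from_array(rng.random((10, 3)).astype("float32"), chunks=(5, 3))
factors = da.from_array(rng.random((64, 3)).astype("float32"), chunks=(32, 3))

# Lazy matrix product: each chunk of the result is computed separately,
# so the full model signal never has to fit in memory at once.
model = da.matmul(loadings, factors.T)
result = model.compute()  # shape (10, 64)
```

Because dask evaluates chunk by chunk, the same pattern scales to factor and loading matrices far larger than available memory.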