MKS Models

MKSHomogenizationModel

class pymks.mks_homogenization_model.MKSHomogenizationModel(basis=None, dimension_reducer=None, n_components=None, property_linker=None, degree=1, correlations=None, compute_correlations=True)

The MKSHomogenizationModel takes in microstructures and their associated macroscopic property, and creates a low-dimensional structure-property linkage. The MKSHomogenizationModel is designed to integrate with dimensionality reduction techniques and predictive models.

degree

Degree of the polynomial used by property_linker.

n_components

Number of components used by dimension_reducer.

dimension_reducer

Instance of a dimensionality reduction class.

property_linker

Instance of a class that maps materials properties to the microstructures.

correlations

Spatial correlations to be computed.

basis

Instance of a basis class.

reduced_fit_data

Low dimensionality representation of spatial correlations used to fit the model.

reduced_predict_data

Low dimensionality representation of spatial correlations predicted by the model.

Below is an example of using MKSHomogenizationModel to predict (or classify) the type of microstructure using PCA and logistic regression.

>>> import numpy as np
>>> n_states = 3
>>> domain = [-1, 1]
>>> from pymks.bases import LegendreBasis
>>> leg_basis = LegendreBasis(n_states=n_states, domain=domain)
>>> from sklearn.decomposition import PCA
>>> from sklearn.linear_model import LogisticRegression
>>> reducer = PCA(n_components=3)
>>> linker = LogisticRegression()
>>> model = MKSHomogenizationModel(
...     basis=leg_basis, dimension_reducer=reducer, property_linker=linker)
>>> from pymks.datasets import make_cahn_hilliard
>>> X0, X1 = make_cahn_hilliard(n_samples=50)
>>> y0 = np.zeros(X0.shape[0])
>>> y1 = np.ones(X1.shape[0])
>>> X = np.concatenate((X0, X1))
>>> y = np.concatenate((y0, y1))
>>> model.fit(X, y)
>>> X0_test, X1_test = make_cahn_hilliard(n_samples=3)
>>> y0_test = model.predict(X0_test)
>>> y1_test = model.predict(X1_test)
>>> assert np.allclose(y0_test, [0, 0, 0])
>>> assert np.allclose(y1_test, [1, 1, 1])

Create an instance of a MKSHomogenizationModel.

Parameters:
  • basis (class, optional) – an instance of a bases class.
  • dimension_reducer (class, optional) – an instance of a dimensionality reduction class with a fit_transform method.
  • property_linker (class, optional) – an instance for a machine learning class with fit and predict methods.
  • n_components (int, optional) – number of components kept by the dimension_reducer
  • degree (int, optional) – degree of the polynomial used by property_linker.
  • correlations (list, optional) – list of spatial correlations to compute, default is the autocorrelation with the first local state and all of its cross correlations. For example, if basis has n_states=3, correlations would be [(0, 0), (0, 1), (0, 2)]
  • compute_correlations (boolean, optional) – If False, spatial correlations will not be calculated as part of the fit and predict methods; the spatial correlations can instead be passed as X to both methods. Default is True.
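The default correlations described above can be written out with a small helper (hypothetical, not part of the pymks API):

```python
# Hypothetical helper mirroring the documented default: the autocorrelation
# of the first local state plus its cross-correlations with every other state.
def default_correlations(n_states):
    return [(0, j) for j in range(n_states)]

print(default_correlations(3))  # [(0, 0), (0, 1), (0, 2)]
```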
fit(X, y, reducer_labels=None, periodic_axes=None, confidence_index=None, size=None)

Fits the data by calculating 2-point statistics from X, performing dimension reduction using dimension_reducer, and fitting the reduced data with the property_linker.

Parameters:
  • X (ND array) – The microstructures or spatial correlations, a (n_samples, n_x, ...) shaped array where n_samples is the number of samples and n_x is the spatial discretization.
  • y (1D array) – The material property associated with X.
  • reducer_labels (1D array, optional) – labels for X used during the fit_transform method of the dimension_reducer.
  • periodic_axes (list, optional) – axes that are periodic. (0, 2) would indicate that axes x and z are periodic in a 3D microstructure.
  • confidence_index (ND array, optional) – array with same shape as X used to assign a confidence value for each data point.

Example

>>> from sklearn.decomposition import PCA
>>> from sklearn.linear_model import LinearRegression
>>> from pymks.bases import PrimitiveBasis
>>> from pymks.stats import correlate
>>> reducer = PCA(n_components=2)
>>> linker = LinearRegression()
>>> prim_basis = PrimitiveBasis(n_states=2, domain=[0, 1])
>>> correlations = [(0, 0), (1, 1), (0, 1)]
>>> model = MKSHomogenizationModel(prim_basis,
...                                dimension_reducer=reducer,
...                                property_linker=linker,
...                                correlations=correlations)
>>> np.random.seed(99)
>>> X = np.random.randint(2, size=(3, 15))
>>> y = np.array([1, 2, 3])
>>> model.fit(X, y)
>>> X_ = prim_basis.discretize(X)
>>> X_stats = correlate(X_)
>>> X_reshaped = X_stats.reshape((X_stats.shape[0], X_stats[0].size))
>>> X_pca = reducer.fit_transform(X_reshaped - np.mean(X_reshaped,
...                               axis=1)[:, None])
>>> assert np.allclose(model.reduced_fit_data, X_pca)

Now let’s use the same method with spatial correlations instead of microstructures.

>>> from sklearn.decomposition import PCA
>>> from sklearn.linear_model import LinearRegression
>>> from pymks.bases import PrimitiveBasis
>>> from pymks.stats import correlate
>>> reducer = PCA(n_components=2)
>>> linker = LinearRegression()
>>> prim_basis = PrimitiveBasis(n_states=2, domain=[0, 1])
>>> correlations = [(0, 0), (1, 1), (0, 1)]
>>> model = MKSHomogenizationModel(dimension_reducer=reducer,
...                                property_linker=linker,
...                                compute_correlations=False)
>>> np.random.seed(99)
>>> X = np.random.randint(2, size=(3, 15))
>>> y = np.array([1, 2, 3])
>>> X_ = prim_basis.discretize(X)
>>> X_stats = correlate(X_, correlations=correlations)
>>> model.fit(X_stats, y)
>>> X_reshaped = X_stats.reshape((X_stats.shape[0], X_stats[0].size))
>>> X_pca = reducer.fit_transform(X_reshaped - np.mean(X_reshaped,
...                               axis=1)[:, None])
>>> assert np.allclose(model.reduced_fit_data, X_pca)
predict(X, periodic_axes=None, confidence_index=None)

Predicts macroscopic property for the microstructures X.

Parameters:
  • X (ND array) – The microstructure, an (n_samples, n_x, ...) shaped array where n_samples is the number of samples and n_x is the spatial discretization.
  • periodic_axes (list, optional) – axes that are periodic. (0, 2) would indicate that axes x and z are periodic in a 3D microstructure.
  • confidence_index (ND array, optional) – array with same shape as X used to assign a confidence value for each data point.
Returns:

The predicted macroscopic property for X.

Example

>>> from sklearn.manifold import LocallyLinearEmbedding
>>> from sklearn.linear_model import BayesianRidge
>>> from pymks.bases import PrimitiveBasis
>>> np.random.seed(99)
>>> X = np.random.randint(2, size=(50, 100))
>>> y = np.random.random(50)
>>> reducer = LocallyLinearEmbedding()
>>> linker = BayesianRidge()
>>> prim_basis = PrimitiveBasis(2, domain=[0, 1])
>>> model = MKSHomogenizationModel(prim_basis, n_components=2,
...                                dimension_reducer=reducer,
...                                property_linker=linker)
>>> model.fit(X, y)
>>> X_test = np.random.randint(2, size=(1, 100))

Predict with microstructures

>>> y_pred = model.predict(X_test)

Predict with spatial correlations

>>> from pymks.stats import correlate
>>> model.compute_correlations = False
>>> X_ = prim_basis.discretize(X_test)
>>> X_corr = correlate(X_, correlations=[(0, 0), (0, 1)])
>>> y_pred_stats = model.predict(X_corr)
>>> assert np.allclose(y_pred_stats, y_pred)
score(X, y, periodic_axes=None, confidence_index=None)

The score function for the MKSHomogenizationModel. It formats the data and uses the score method from the property_linker.

Parameters:
  • X (ND array) – The microstructure, an (n_samples, n_x, ...) shaped array where n_samples is the number of samples and n_x is the spatial discretization.
  • y (1D array) – The material property associated with X.
  • periodic_axes (list, optional) – axes that are periodic. (0, 2) would indicate that axes x and z are periodic in a 3D microstructure.
  • confidence_index (ND array, optional) – array with same shape as X used to assign a confidence value for each data point.
Returns:

Score for MKSHomogenizationModel from the selected property_linker.
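Internally, score reduces the data with the fitted dimension_reducer and then delegates to the score method of the property_linker (R² for regressors). A scikit-learn-only sketch of that delegation, using synthetic stand-in data rather than real spatial statistics:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Stand-in for flattened 2-point statistics and a linear macroscopic property.
rng = np.random.RandomState(0)
X_stats = rng.rand(20, 10)        # (n_samples, n_features)
y = X_stats.dot(rng.rand(10))     # synthetic property values

reducer = PCA(n_components=5)
linker = LinearRegression()

# What fit does after computing statistics: reduce, then link.
X_reduced = reducer.fit_transform(X_stats)
linker.fit(X_reduced, y)

# score then simply calls the linker's own score on the reduced data.
r2 = linker.score(reducer.transform(X_stats), y)
```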

MKSLocalizationModel

class pymks.mks_localization_model.MKSLocalizationModel(basis, n_states=None)

The MKSLocalizationModel fits data using the Materials Knowledge System in Fourier space. The following demonstrates the viability of the MKSLocalizationModel with a simple 1D filter.

basis

Basis function used to discretize the microstructure.

n_states

Integer value for the number of local states; if a basis is specified, n_states indicates the order of the polynomial.

coeff

Array of values that are the influence coefficients.

>>> n_states = 2
>>> n_spaces = 81
>>> n_samples = 400

Define a filter function.

>>> def filter(x):
...     return np.where(x < 10,
...                     np.exp(-abs(x)) * np.cos(x * np.pi),
...                     np.exp(-abs(x - 20)) * np.cos((x - 20) * np.pi))

Use the filter function to construct some coefficients.

>>> coeff = np.linspace(1, 0, n_states)[None,:] * filter(np.linspace(0, 20,
...                                                      n_spaces))[:,None]
>>> Fcoeff = np.fft.fft(coeff, axis=0)

Make some test samples.

>>> np.random.seed(2)
>>> X = np.random.random((n_samples, n_spaces))

Construct a response with the Fcoeff.

>>> H = np.linspace(0, 1, n_states)
>>> X_ = np.maximum(1 - abs(X[:,:,None] - H) / (H[1] - H[0]), 0)
>>> FX = np.fft.fft(X_, axis=1)
>>> Fy = np.sum(Fcoeff[None] * FX, axis=-1)
>>> y = np.fft.ifft(Fy, axis=1).real

Use the MKSLocalizationModel to reconstruct the coefficients

>>> from pymks.bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(n_states, [0, 1])
>>> model = MKSLocalizationModel(basis=prim_basis)
>>> model.fit(X, y)

Check the result

>>> assert np.allclose(np.fft.fftshift(coeff, axes=(0,)), model.coeff)

Instantiate an MKSLocalizationModel.

Parameters:
  • basis (class) – an instance of a bases class.
  • n_states (int, optional) – number of local states
coeff

Returns the coefficients in real space with origin shifted to the center.
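The kernel itself is stored in Fourier space; this property inverts the transform and applies an fftshift so the zero-offset coefficient sits at the array center. A NumPy-only sketch of that centering step (not the pymks internals):

```python
import numpy as np

# A real-space kernel whose peak sits at index 0 (the spatial origin).
kernel = np.zeros(8)
kernel[0], kernel[1], kernel[-1] = 1.0, 0.5, 0.5

# Store it in Fourier space, as the model does ...
Fkernel = np.fft.fft(kernel)

# ... then recover it and shift the origin to the center of the array.
coeff = np.fft.fftshift(np.fft.ifft(Fkernel).real)
print(coeff)  # peak is now at index 4, the center
```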

fit(X, y, size=None)

Fits the data by calculating a set of influence coefficients.

Parameters:
  • X (ND array) – The microstructure, an (n_samples, n_x, ...) shaped array where n_samples is the number of samples and n_x is the spatial discretization.
  • y (ND array) – The response field, same shape as X.
  • size (tuple, optional) – Alters the shape of X and y during the calibration of the influence coefficients. If None, the size of the influence coefficients is the same shape as X and y.

Example

>>> X = np.linspace(0, 1, 4).reshape((1, 2, 2))
>>> y = X.swapaxes(1, 2)
>>> from pymks.bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(2, [0, 1])
>>> model = MKSLocalizationModel(basis=prim_basis)
>>> model.fit(X, y)
>>> assert np.allclose(model._filter.Fkernel, [[[ 0.5,  0.5],
...                                             [  -2,    0]],
...                                            [[-0.5,  0  ],
...                                             [  -1,  0  ]]])
predict(X)

Predicts a new response from the microstructure function X with calibrated influence coefficients.

Parameters: X (ND array) – The microstructure, an (n_samples, n_x, ...) shaped array where n_samples is the number of samples and n_x is the spatial discretization.
Returns: The predicted response field, the same shape as X.

Example

>>> X = np.linspace(0, 1, 4).reshape((1, 2, 2))
>>> y = X.swapaxes(1, 2)
>>> from pymks.bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(2, [0, 1])
>>> model = MKSLocalizationModel(basis=prim_basis)
>>> model.fit(X, y)
>>> assert np.allclose(y, model.predict(X))

The fit method must be called to calibrate the coefficients before the predict method can be used.

>>> MKSModel = MKSLocalizationModel(prim_basis)
>>> MKSModel.predict(X)
Traceback (most recent call last):
...
AttributeError: fit() method must be run before predict().
resize_coeff(size)

Scale the size of the coefficients and pad with zeros.

Parameters: size (tuple) – The new size of the influence coefficients.
Returns: The influence coefficients resized to size.

Example

Let’s first instantiate a model and fabricate some coefficients.

>>> from pymks.bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(n_states=2)
>>> model = MKSLocalizationModel(prim_basis)
>>> coeff = np.arange(20).reshape((5, 4, 1))
>>> coeff = np.concatenate((coeff , np.ones_like(coeff)), axis=2)
>>> coeff = np.fft.ifftshift(coeff, axes=(0, 1))
>>> model._filter = Filter(np.fft.fftn(coeff, axes=(0, 1))[None])

The coefficients can be resized by passing the new shape that the coefficients should have.

>>> model.resize_coeff((10, 7))
>>> assert np.allclose(model.coeff[:, :, 0],
...                    [[0, 0, 0, 0, 0, 0, 0],
...                     [0, 0, 0, 0, 0, 0, 0],
...                     [0, 0, 0, 0, 0, 0, 0],
...                     [0, 0, 0, 1, 2, 3, 0],
...                     [0, 0, 4, 5, 6, 7, 0],
...                     [0, 0, 8, 9, 10, 11, 0],
...                     [0, 0, 12, 13, 14, 15, 0],
...                     [0, 0, 16, 17, 18, 19, 0],
...                     [0, 0, 0, 0, 0, 0, 0],
...                     [0, 0, 0, 0, 0, 0, 0]])