The MKSHomogenizationModel takes in microstructures and their associated macroscopic properties, and creates a low-dimensional structure-property linkage. The MKSHomogenizationModel is designed to integrate with dimensionality reduction techniques and predictive models.
Degree of the polynomial used by property_linker.
Number of components used by dimension_reducer.
Instance of a dimensionality reduction class.
Instance of a class that maps the materials property to the microstructures.
Spatial correlations to be computed.
Instance of a basis class.
Low dimensionality representation of spatial correlations used to fit the model.
Low dimensionality representation of spatial correlations predicted by the model.
Below is an example of using MKSHomogenizationModel to predict (or classify) the type of microstructure using PCA and logistic regression.
>>> n_states = 3
>>> domain = [1, 1]
>>> from pymks.bases import LegendreBasis
>>> leg_basis = LegendreBasis(n_states=n_states, domain=domain)
>>> from sklearn.decomposition import PCA
>>> from sklearn.linear_model import LogisticRegression
>>> reducer = PCA(n_components=3)
>>> linker = LogisticRegression()
>>> model = MKSHomogenizationModel(
... basis=leg_basis, dimension_reducer=reducer, property_linker=linker)
>>> from pymks.datasets import make_cahn_hilliard
>>> X0, X1 = make_cahn_hilliard(n_samples=50)
>>> y0 = np.zeros(X0.shape[0])
>>> y1 = np.ones(X1.shape[0])
>>> X = np.concatenate((X0, X1))
>>> y = np.concatenate((y0, y1))
>>> model.fit(X, y)
>>> X0_test, X1_test = make_cahn_hilliard(n_samples=3)
>>> y0_test = model.predict(X0_test)
>>> y1_test = model.predict(X1_test)
>>> assert np.allclose(y0_test, [0, 0, 0])
>>> assert np.allclose(y1_test, [1, 1, 1])
Create an instance of an MKSHomogenizationModel.
Parameters: 


Fits data by calculating 2-point statistics from X, performing dimensionality reduction using dimension_reducer, and fitting the reduced data with the property_linker.
Parameters: 


Example
>>> from sklearn.decomposition import PCA
>>> from sklearn.linear_model import LinearRegression
>>> from pymks.bases import PrimitiveBasis
>>> from pymks.stats import correlate
>>> reducer = PCA(n_components=2)
>>> linker = LinearRegression()
>>> prim_basis = PrimitiveBasis(n_states=2, domain=[0, 1])
>>> correlations = [(0, 0), (1, 1), (0, 1)]
>>> model = MKSHomogenizationModel(prim_basis,
... dimension_reducer=reducer,
... property_linker=linker,
... correlations=correlations)
>>> np.random.seed(99)
>>> X = np.random.randint(2, size=(3, 15))
>>> y = np.array([1, 2, 3])
>>> model.fit(X, y)
>>> X_ = prim_basis.discretize(X)
>>> X_stats = correlate(X_)
>>> X_reshaped = X_stats.reshape((X_stats.shape[0], X_stats[0].size))
>>> X_pca = reducer.fit_transform(X_reshaped - np.mean(X_reshaped,
... axis=1)[:, None])
>>> assert np.allclose(model.reduced_fit_data, X_pca)
Now let’s use the same method with spatial correlations instead of microstructures.
>>> from sklearn.decomposition import PCA
>>> from sklearn.linear_model import LinearRegression
>>> from pymks.bases import PrimitiveBasis
>>> from pymks.stats import correlate
>>> reducer = PCA(n_components=2)
>>> linker = LinearRegression()
>>> prim_basis = PrimitiveBasis(n_states=2, domain=[0, 1])
>>> correlations = [(0, 0), (1, 1), (0, 1)]
>>> model = MKSHomogenizationModel(dimension_reducer=reducer,
... property_linker=linker,
... compute_correlations=False)
>>> np.random.seed(99)
>>> X = np.random.randint(2, size=(3, 15))
>>> y = np.array([1, 2, 3])
>>> X_ = prim_basis.discretize(X)
>>> X_stats = correlate(X_, correlations=correlations)
>>> model.fit(X_stats, y)
>>> X_reshaped = X_stats.reshape((X_stats.shape[0], X_stats[0].size))
>>> X_pca = reducer.fit_transform(X_reshaped - np.mean(X_reshaped,
... axis=1)[:, None])
>>> assert np.allclose(model.reduced_fit_data, X_pca)
Predicts macroscopic property for the microstructures X.
Parameters: 


Returns:  The predicted macroscopic property for X. 
Example
>>> from sklearn.manifold import LocallyLinearEmbedding
>>> from sklearn.linear_model import BayesianRidge
>>> from pymks.bases import PrimitiveBasis
>>> np.random.seed(99)
>>> X = np.random.randint(2, size=(50, 100))
>>> y = np.random.random(50)
>>> reducer = LocallyLinearEmbedding()
>>> linker = BayesianRidge()
>>> prim_basis = PrimitiveBasis(2, domain=[0, 1])
>>> model = MKSHomogenizationModel(prim_basis, n_components=2,
... dimension_reducer=reducer,
... property_linker=linker)
>>> model.fit(X, y)
>>> X_test = np.random.randint(2, size=(1, 100))
Predict with microstructures
>>> y_pred = model.predict(X_test)
Predict with spatial correlations
>>> from pymks.stats import correlate
>>> model.compute_correlations = False
>>> X_ = prim_basis.discretize(X_test)
>>> X_corr = correlate(X_, correlations=[(0, 0), (0, 1)])
>>> y_pred_stats = model.predict(X_corr)
>>> assert np.allclose(y_pred_stats, y_pred)
The score function for the MKSHomogenizationModel. It formats the data and uses the score method from the property_linker.
Parameters: 


Returns:  Score for MKSHomogenizationModel from the selected property_linker. 
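Since score delegates to the linker, its behavior can be sketched with plain scikit-learn components. The PCA/LinearRegression pair and the centering step below are assumptions that mirror the fit example above, not the exact internals of the method.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Sketch of the delegation: reduce the formatted data, then let the
# linker's own score method (R^2 for LinearRegression) do the scoring.
np.random.seed(0)
X = np.random.random((20, 30))   # stand-in for flattened 2-point statistics
y = np.random.random(20)

reducer = PCA(n_components=2)
linker = LinearRegression()

# Center and reduce, as in the fit example above.
X_reduced = reducer.fit_transform(X - np.mean(X, axis=1)[:, None])
linker.fit(X_reduced, y)

# Training R^2 for ordinary least squares with an intercept lies in [0, 1].
r2 = linker.score(X_reduced, y)
assert 0.0 <= r2 <= 1.0
```

Whatever score method the chosen property_linker implements (R^2 for regressors, mean accuracy for classifiers such as LogisticRegression) determines what this method returns.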
The MKSLocalizationModel fits data using the Materials Knowledge System in Fourier Space. The following demonstrates the viability of the MKSLocalizationModel with a simple 1D filter.
Basis function used to discretize the microstructure.
Integer value for the number of local states; if a basis is specified, n_states indicates the order of the polynomial.
Array of values that are the influence coefficients.
>>> n_states = 2
>>> n_spaces = 81
>>> n_samples = 400
Define a filter function.
>>> def filter(x):
... return np.where(x < 10,
... np.exp(-abs(x)) * np.cos(x * np.pi),
... np.exp(-abs(x - 20)) * np.cos((x - 20) * np.pi))
Use the filter function to construct some coefficients.
>>> coeff = np.linspace(1, 0, n_states)[None,:] * filter(np.linspace(0, 20,
... n_spaces))[:,None]
>>> Fcoeff = np.fft.fft(coeff, axis=0)
Make some test samples.
>>> np.random.seed(2)
>>> X = np.random.random((n_samples, n_spaces))
Construct a response with the Fcoeff.
>>> H = np.linspace(0, 1, n_states)
>>> X_ = np.maximum(1 - abs(X[:, :, None] - H) / (H[1] - H[0]), 0)
>>> FX = np.fft.fft(X_, axis=1)
>>> Fy = np.sum(Fcoeff[None] * FX, axis=1)
>>> y = np.fft.ifft(Fy, axis=1).real
Use the MKSLocalizationModel to reconstruct the coefficients
>>> from .bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(n_states, [0, 1])
>>> model = MKSLocalizationModel(basis=prim_basis)
>>> model.fit(X, y)
Check the result
>>> assert np.allclose(np.fft.fftshift(coeff, axes=(0,)), model.coeff)
Instantiate an MKSLocalizationModel.
Parameters: 


Returns the coefficients in real space with origin shifted to the center.
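The centering can be illustrated with numpy alone; the small array below is a hypothetical set of 1D coefficients, not output from a fitted model.

```python
import numpy as np

# Influence coefficients are stored with the origin at index 0; moving
# the origin to the center of the array is what np.fft.fftshift does.
raw = np.array([0., 1., 2., 3., 4.])   # hypothetical coefficients, origin at index 0
centered = np.fft.fftshift(raw)

# The value that sat at index 0 now sits at the middle index.
assert centered[len(centered) // 2] == raw[0]
assert np.allclose(centered, [3., 4., 0., 1., 2.])
```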
Fits the data by calculating a set of influence coefficients.
Parameters: 


Example
>>> X = np.linspace(0, 1, 4).reshape((1, 2, 2))
>>> y = X.swapaxes(1, 2)
>>> from .bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(2, [0, 1])
>>> model = MKSLocalizationModel(basis=prim_basis)
>>> model.fit(X, y)
>>> assert np.allclose(model._filter.Fkernel, [[[0.5, 0.5],
... [-2, 0]],
... [[-0.5, 0],
... [1, 0]]])
Predicts a new response from the microstructure function X with calibrated influence coefficients.
Parameters:  X (ND array) – The microstructure, an (n_samples, n_x, ...) shaped array where n_samples is the number of samples and n_x is the spatial discretization. 

Returns:  The predicted response field the same shape as X. 
Example
>>> X = np.linspace(0, 1, 4).reshape((1, 2, 2))
>>> y = X.swapaxes(1, 2)
>>> from .bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(2, [0, 1])
>>> model = MKSLocalizationModel(basis=prim_basis)
>>> model.fit(X, y)
>>> assert np.allclose(y, model.predict(X))
The fit method must be called to calibrate the coefficients before the predict method can be used.
>>> MKSModel = MKSLocalizationModel(prim_basis)
>>> MKSModel.predict(X)
Traceback (most recent call last):
...
AttributeError: fit() method must be run before predict().
Scale the size of the coefficients and pad with zeros.
Parameters:  size (tuple) – The new size of the influence coefficients. 

Returns:  The resized influence coefficients to size. 
Example
Let’s first instantiate a model and fabricate some coefficients.
>>> from pymks.bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(n_states=2)
>>> model = MKSLocalizationModel(prim_basis)
>>> coeff = np.arange(20).reshape((5, 4, 1))
>>> coeff = np.concatenate((coeff, np.ones_like(coeff)), axis=2)
>>> coeff = np.fft.ifftshift(coeff, axes=(0, 1))
>>> model._filter = Filter(np.fft.fftn(coeff, axes=(0, 1))[None])
The coefficients can be reshaped by passing the new shape that coefficients should have.
>>> model.resize_coeff((10, 7))
>>> assert np.allclose(model.coeff[:,:,0],
... [[0, 0, 0, 0, 0, 0, 0],
... [0, 0, 0, 0, 0, 0, 0],
... [0, 0, 0, 0, 0, 0, 0],
... [0, 0, 0, 1, 2, 3, 0],
... [0, 0, 4, 5, 6, 7, 0],
... [0, 0, 8, 9, 10, 11, 0],
... [0, 0, 12, 13, 14, 15, 0],
... [0, 0, 16, 17, 18, 19, 0],
... [0, 0, 0, 0, 0, 0, 0],
... [0, 0, 0, 0, 0, 0, 0]])