alex.ml.gmm package

Submodules

alex.ml.gmm.gmm module

class alex.ml.gmm.gmm.GMM(n_features=1, n_components=1, thresh=0.001, min_covar=0.001, n_iter=1)[source]

This is a GMM model of the input data. It is memory efficient, so it can process very large array-like input objects.

Mixture components are added incrementally by splitting the heaviest component into two and perturbing the original mean.
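As a rough illustration of this splitting step (a minimal sketch, not the library's implementation; the function name, the `weights`/`means` arrays, and the perturbation scale `eps` are assumptions):

import numpy as np

def split_heaviest(weights, means, eps=0.1):
    """Split the heaviest mixture component by perturbing its mean.

    weights: (n_components,) mixture weights
    means:   (n_components, n_features) component means
    eps:     perturbation scale -- an assumed parameter, not from the library
    """
    weights = np.asarray(weights, dtype=float).copy()
    means = np.asarray(means, dtype=float)
    k = int(np.argmax(weights))                        # heaviest component
    new_mean = means[k] + eps * np.random.randn(means.shape[1])
    weights[k] /= 2.0                                  # split its weight in two
    weights = np.append(weights, weights[k])
    means = np.vstack([means, new_mean])               # add the perturbed copy
    return weights, means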

expectation(x)[source]

Evaluate the expectation step for a single example.

fit(X)[source]

Fit the GMM to the data X.

load_model(file_name)[source]

Load the model from a pickle file.

log_multivariate_normal_density_diag(x, means=0.0, covars=1.0)[source]

Compute the Gaussian log-density at x for a diagonal covariance model.
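For reference, the standard diagonal-covariance log-density this method evaluates can be sketched in plain NumPy as below (a standalone sketch, not the library's code; `x` has shape (n_features,), `means` and `covars` have shape (n_components, n_features), and the defaults above correspond to a standard normal):

import numpy as np

def log_mvn_density_diag(x, means, covars):
    """Log N(x; mean_k, diag(covar_k)) for each component k."""
    n_features = x.shape[-1]
    return -0.5 * (n_features * np.log(2.0 * np.pi)
                   + np.sum(np.log(covars), axis=-1)
                   + np.sum((x - means) ** 2 / covars, axis=-1))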

mixup(n_new_mixies)[source]

Add n_new_mixies new components to the mixture.

save_model(file_name)[source]

Save the GMM model as a pickle.

score(x)[source]

Return the log probability of x being generated by the mixture.
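A short end-to-end sketch of how this API might be used; the toy data array `X`, the chosen constructor arguments, and the exact method behaviour are assumptions based on the signatures and docstrings above, not a verified example from the library:

import numpy as np
from alex.ml.gmm.gmm import GMM

X = np.random.randn(10000, 2)            # toy data: 10k two-dimensional samples

gmm = GMM(n_features=2, n_components=1, n_iter=10)
gmm.fit(X)                                # fit the initial single-component model

gmm.mixup(1)                              # split the heaviest component -> 2 components
gmm.fit(X)                                # refit after adding the new component

print(gmm.score(X[0]))                    # log prob of one example under the mixture

gmm.save_model('gmm.pkl')                 # persist the model as a pickle
gmm2 = GMM(n_features=2)
gmm2.load_model('gmm.pkl')                # restore it later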

Module contents

class alex.ml.gmm.GMM(n_features=1, n_components=1, thresh=0.001, min_covar=0.001, n_iter=1)[source]

This is a GMM model of the input data. It is memory efficient, so it can process very large array-like input objects.

Mixture components are added incrementally by splitting the heaviest component into two and perturbing the original mean.

expectation(x)[source]

Evaluate the expectation step for a single example.

fit(X)[source]

Fit the GMM to the data X.

load_model(file_name)[source]

Load the model from a pickle file.

log_multivariate_normal_density_diag(x, means=0.0, covars=1.0)[source]

Compute the Gaussian log-density at x for a diagonal covariance model.

mixup(n_new_mixies)[source]

Add n_new_mixies new components to the mixture.

save_model(file_name)[source]

Save the GMM model as a pickle.

score(x)[source]

Return the log probability of x being generated by the mixture.