diamondback.models package

Submodules

diamondback.models.DiversityModel module

Description

A diversity model realizes the selection and retention of a state, a finite collection of observations extracted from an incident signal, to maximize the minimum distance between any members of the state according to a specified style, or distance metric.

\[d_{k} = \min(\ d_{u,v}\ )\quad\quad u, v \in [\ 0,\ M\ ),\ u \neq v\]
\[d_{k} \geq d_{n}\qquad \longrightarrow\qquad d_{n} = d_{k}\]

A diversity model is an opportunistic unsupervised learning model which typically improves condition and numerical accuracy, and reduces storage, relative to alternative approaches including a generalized linear inverse.

A state array of a specified order is defined. A stationary dimension is inferred. A style and order are specified.

Style is in ( ‘Chebyshev’, ‘Euclidean’, ‘Geometric’, ‘Manhattan’ ).

  • ‘Chebyshev’ distance is an L-infinity norm, a maximum absolute difference
    in any dimension.
\[d_{u,v} = \max(\ |\ \vec{x_{u}} - \vec{x_{v}}\ |\ )\]
  • ‘Euclidean’ distance is an L-2 norm, a square root of a sum of squared
    differences in each dimension.
\[d_{u,v} = {\left(\ \sum_{i=0}^{N}{(\ \vec{x_{u,i}} - \vec{x_{v,i}}\ )}^{2}\ \right)}^{0.5}\]
  • ‘Geometric’ distance is an ordered root of a product of absolute differences
    in each dimension.
\[d_{u,v} = \prod_{i=0}^{N}{(\ |\ \vec{x_{u,i}} - \vec{x_{v,i}}\ |\ )}^{\frac{1}{N}}\]
  • ‘Manhattan’ distance is an L-1 norm, a sum of absolute differences in each
    dimension.
\[d_{u,v} = \sum_{i=0}^{N}{\ (\ |\ \vec{x_{u,i}} - \vec{x_{v,i}}\ |\ )\ }\]
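The four distance styles above can be written directly in numpy. This is an illustrative sketch, not diamondback's internal `DISTANCE` table; the function names are hypothetical.

```python
import numpy as np

def chebyshev(u, v):
    """L-infinity norm: maximum absolute difference in any dimension."""
    return np.max(np.abs(u - v))

def euclidean(u, v):
    """L-2 norm: square root of a sum of squared differences."""
    return np.sum((u - v) ** 2) ** 0.5

def geometric(u, v):
    """Ordered root of a product of absolute differences."""
    return np.prod(np.abs(u - v)) ** (1.0 / len(u))

def manhattan(u, v):
    """L-1 norm: sum of absolute differences in each dimension."""
    return np.sum(np.abs(u - v))
```

For example, with u = ( 1, 4 ) and v = ( 3, 1 ) the absolute differences are ( 2, 3 ), so Chebyshev yields 3, Euclidean 13 ** 0.5, Geometric 6 ** 0.5, and Manhattan 5.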

Example

import numpy
from diamondback import DiversityModel

# Create an instance.

obj = DiversityModel( style = 'Euclidean', order = 4 )

# Learn an incident signal and extract a state.

x = numpy.random.rand( 32, 2 )
y = obj.learn( x )
s = obj.s
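The max-min retention that learn performs can be sketched as a greedy replacement loop: when trading a member of the closest pair for an incoming observation increases the minimum pairwise distance, the trade is retained. This is a plausible reading of the equations above with a Euclidean metric, not diamondback's implementation; `diversity_learn` is a hypothetical helper.

```python
import numpy as np

def diversity_learn(x, order=4):
    """Greedy max-min retention sketch over a Euclidean metric."""
    x = np.asarray(x, dtype=float)
    state = x[:order].copy()
    trace = []
    for v in x[order:]:
        # Minimum pairwise distance within the current state.
        d = np.linalg.norm(state[:, None] - state[None, :], axis=-1)
        np.fill_diagonal(d, np.inf)
        best = d.min()
        # Try replacing one member of the closest pair with the new sample.
        i = np.unravel_index(d.argmin(), d.shape)[0]
        trial = state.copy()
        trial[i] = v
        t = np.linalg.norm(trial[:, None] - trial[None, :], axis=-1)
        np.fill_diagonal(t, np.inf)
        if t.min() > best:  # accept only if minimum distance increases
            state, best = trial, t.min()
        trace.append(best)
    return state, np.array(trace)
```

Because a replacement is accepted only when it increases the minimum distance, the traced diversity is nondecreasing over the incident signal.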
License

BSD-3C. © 2018 - 2024 Larry Turner, Schneider Electric Industries SAS. All rights reserved.

Author

Larry Turner, Schneider Electric, AI Hub, 2018-02-08.

class diamondback.models.DiversityModel.DiversityModel(style: str, order: int)[source]

Bases: object

Diversity model.

Initialize.

Arguments :

style : str - in ( ‘Chebyshev’, ‘Euclidean’, ‘Geometric’, ‘Manhattan’ ).
order : int.

DISTANCE = {'Chebyshev': <function DiversityModel.<lambda>>, 'Euclidean': <function DiversityModel.<lambda>>, 'Geometric': <function DiversityModel.<lambda>>, 'Manhattan': <function DiversityModel.<lambda>>}
STYLE = ('Chebyshev', 'Euclidean', 'Geometric', 'Manhattan')
clear() None[source]

Clears an instance.

learn(x: list | ndarray) ndarray[source]

Learns an incident signal and produces a reference signal.

Arguments :

x : Union[ list, numpy.ndarray ] - incident signal.

Returns :

y : numpy.ndarray - diversity.

property s

diamondback.models.GaussianMixtureModel module

Description

A Gaussian Mixture Model (GMM) is a semi-supervised probabilistic learning model which uses maximum likelihood estimation, regularization, and expectation maximization to maximize posterior probability and classify an incident signal. It learns model instances of a specified order per class, where intra-class models capture mixture distributions.
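The expectation-maximization the description refers to can be sketched for a single class in numpy, assuming diagonal covariances and a farthest-point initialization; `gmm_em` is a hypothetical helper, not diamondback's implementation.

```python
import numpy as np

def gmm_em(x, order=2, index=100, regularize=1e-3, seed=0):
    """Minimal diagonal-covariance EM for a Gaussian mixture."""
    rng = np.random.default_rng(seed)
    n, d = x.shape
    # Farthest-point initialization of the means.
    mu = np.empty((order, d))
    mu[0] = x[rng.integers(n)]
    for k in range(1, order):
        dist = ((x[:, None, :] - mu[None, :k, :]) ** 2).sum(axis=2).min(axis=1)
        mu[k] = x[dist.argmax()]
    var = np.tile(x.var(axis=0) + regularize, (order, 1))
    w = np.full(order, 1.0 / order)
    for _ in range(index):
        # E step: responsibilities from weighted log densities.
        logp = np.log(w) - 0.5 * (((x[:, None, :] - mu) ** 2) / var
                                  + np.log(2.0 * np.pi * var)).sum(axis=2)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M step: update weights, means, and regularized variances.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r.T @ x) / nk[:, None]
        var = (r.T @ x ** 2) / nk[:, None] - mu ** 2 + regularize
    return w, mu, var
```

Here order plays the role of mixture distributions per class, index bounds the iterations, and regularize keeps the variances bounded away from zero.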

Example

import numpy
from diamondback import GaussianMixtureModel

# Create an instance.

obj = GaussianMixtureModel( order = 10, index = 100 )

# Learn an incident signal and predict classification.

x, y = numpy.random.rand( 32, 2 ), numpy.random.randint( 0, 10, 32 )
obj.learn( x, y )
x = numpy.random.rand( 16, 2 )
v = obj.predict( x )
License

BSD-3C. © 2018 - 2024 Larry Turner, Schneider Electric Industries SAS. All rights reserved.

Author

Larry Turner, Schneider Electric, AI Hub, 2018-02-08.

class diamondback.models.GaussianMixtureModel.GaussianMixtureModel(order: int = 10, index: int = 100, regularize: float = 0.1)[source]

Bases: object

Gaussian mixture model.

Initialize.

Arguments :

order : int - mixture distributions per class.
index : int - iterations.
regularize : float - regularization.

property index
learn(x: ndarray, y: ndarray) None[source]

Learns an incident signal with ground truth label and estimates inverse covariance and mean matrices to learn mixed distribution instances for each class.

Arguments :

x : numpy.ndarray ( batch, count ) - incident.
y : numpy.ndarray ( batch ) - label.

property order
predict(x: ndarray) ndarray[source]

Predicts an estimate of ground truth label from an incident signal and maximizes posterior probability of weighted intra-class mixed distributions.

Predictions for each class are ranked and ordered by descending probability, and the initial prediction is the most likely class.

Arguments :

x : numpy.ndarray ( batch, count ) - data.

Returns :

v : numpy.ndarray ( batch, class ) - predict.

property regularize
property shape

diamondback.models.GaussianModel module

Description

A Gaussian Model (GM) is a supervised probabilistic learning model which uses maximum likelihood estimation and regularization to maximize posterior probability and classify an incident signal. It learns one distribution instance per class.
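The per-class estimation and descending-probability ranking can be sketched in numpy with one regularized Gaussian per class; the helpers `learn` and `predict` below are illustrative, not diamondback's implementation.

```python
import numpy as np

def learn(x, y, regularize=0.1):
    """Fit one regularized Gaussian per class: mean, inverse covariance."""
    model = {}
    for c in np.unique(y):
        xc = x[y == c]
        cov = np.cov(xc, rowvar=False) + regularize * np.eye(x.shape[1])
        _, logdet = np.linalg.slogdet(cov)
        model[int(c)] = (xc.mean(axis=0), np.linalg.inv(cov), logdet)
    return model

def predict(model, x):
    """Rank classes by Gaussian log likelihood, descending."""
    classes = sorted(model)
    score = np.empty((len(x), len(classes)))
    for j, c in enumerate(classes):
        mu, icov, logdet = model[c]
        e = x - mu
        # Mahalanobis distance plus log determinant per class.
        score[:, j] = -0.5 * (np.einsum('ij,jk,ik->i', e, icov, e) + logdet)
    return np.asarray(classes)[np.argsort(-score, axis=1)]
```

The returned array has one row per observation with the most likely class first, matching the ranked ( batch, class ) prediction described below.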

Example

import numpy
from diamondback import GaussianModel

# Create an instance.

obj = GaussianModel( )

# Learn an incident signal and predict classification.

x, y = numpy.random.rand( 32, 2 ), numpy.random.randint( 0, 10, 32 )
obj.learn( x, y )
x = numpy.random.rand( 16, 2 )
v = obj.predict( x )
License

BSD-3C. © 2018 - 2024 Larry Turner, Schneider Electric Industries SAS. All rights reserved.

Author

Larry Turner, Schneider Electric, AI Hub, 2018-02-08.

class diamondback.models.GaussianModel.GaussianModel(regularize: float = 0.1)[source]

Bases: object

Gaussian model.

Initialize.

Arguments :

regularize : float - regularization.

learn(x: ndarray, y: ndarray) None[source]

Learns an incident signal with ground truth label and estimates inverse covariance and mean matrices to learn a distribution instance for each class.

Arguments :

x : numpy.ndarray ( batch, count ) - incident.
y : numpy.ndarray ( batch ) - label.

predict(x: ndarray) ndarray[source]

Predicts an estimate of ground truth label from an incident signal and maximizes posterior probability.

Predictions for each class are ranked and ordered by descending probability, and the initial prediction is the most likely class.

Arguments :

x : numpy.ndarray ( batch, count ) - data.

Returns :

v : numpy.ndarray ( batch, class ) - predict.

property regularize
property shape

Module contents

Description

Initialize.

License

BSD-3C. © 2018 - 2024 Larry Turner, Schneider Electric Industries SAS. All rights reserved.

Author

Larry Turner, Schneider Electric, AI Hub, 2018-03-22.

class diamondback.models.DiversityModel(style: str, order: int)[source]

Bases: object

Diversity model.

Initialize.

Arguments :

style : str - in ( ‘Chebyshev’, ‘Euclidean’, ‘Geometric’, ‘Manhattan’ ).
order : int.

DISTANCE = {'Chebyshev': <function DiversityModel.<lambda>>, 'Euclidean': <function DiversityModel.<lambda>>, 'Geometric': <function DiversityModel.<lambda>>, 'Manhattan': <function DiversityModel.<lambda>>}
STYLE = ('Chebyshev', 'Euclidean', 'Geometric', 'Manhattan')
clear() None[source]

Clears an instance.

learn(x: list | ndarray) ndarray[source]

Learns an incident signal and produces a reference signal.

Arguments :

x : Union[ list, numpy.ndarray ] - incident signal.

Returns :

y : numpy.ndarray - diversity.

property s
class diamondback.models.GaussianMixtureModel(order: int = 10, index: int = 100, regularize: float = 0.1)[source]

Bases: object

Gaussian mixture model.

Initialize.

Arguments :

order : int - mixture distributions per class.
index : int - iterations.
regularize : float - regularization.

property index
learn(x: ndarray, y: ndarray) None[source]

Learns an incident signal with ground truth label and estimates inverse covariance and mean matrices to learn mixed distribution instances for each class.

Arguments :

x : numpy.ndarray ( batch, count ) - incident.
y : numpy.ndarray ( batch ) - label.

property order
predict(x: ndarray) ndarray[source]

Predicts an estimate of ground truth label from an incident signal and maximizes posterior probability of weighted intra-class mixed distributions.

Predictions for each class are ranked and ordered by descending probability, and the initial prediction is the most likely class.

Arguments :

x : numpy.ndarray ( batch, count ) - data.

Returns :

v : numpy.ndarray ( batch, class ) - predict.

property regularize
property shape
class diamondback.models.GaussianModel(regularize: float = 0.1)[source]

Bases: object

Gaussian model.

Initialize.

Arguments :

regularize : float - regularization.

learn(x: ndarray, y: ndarray) None[source]

Learns an incident signal with ground truth label and estimates inverse covariance and mean matrices to learn a distribution instance for each class.

Arguments :

x : numpy.ndarray ( batch, count ) - incident.
y : numpy.ndarray ( batch ) - label.

predict(x: ndarray) ndarray[source]

Predicts an estimate of ground truth label from an incident signal and maximizes posterior probability.

Predictions for each class are ranked and ordered by descending probability, and the initial prediction is the most likely class.

Arguments :

x : numpy.ndarray ( batch, count ) - data.

Returns :

v : numpy.ndarray ( batch, class ) - predict.

property regularize
property shape