Architectures
Sampling
Sampling
Sampling (*args, **kwargs)
Base class for all neural network modules.

Your models should also subclass this class.

Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

```python
import torch.nn as nn
import torch.nn.functional as F

class Model(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.conv1 = nn.Conv2d(1, 20, 5)
        self.conv2 = nn.Conv2d(20, 20, 5)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        return F.relu(self.conv2(x))
```

Submodules assigned in this way will be registered, and will have their parameters converted too when you call `to()`, etc.

Note: as in the example above, an `__init__()` call to the parent class must be made before assignment on the child.

`training` (bool): whether this module is in training or evaluation mode.
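`Sampling` is the stochastic layer of the VAE. As a hedged, numpy-only sketch (names and shapes are illustrative, not the package's actual API), the reparameterization trick such a layer typically implements looks like this:

```python
import numpy as np

def sampling(z_mean, z_log_var, rng=None):
    """Reparameterization trick: draw z = mu + sigma * eps with eps ~ N(0, I),
    which keeps the sampling step differentiable w.r.t. mu and log-variance."""
    rng = rng or np.random.default_rng(0)
    eps = rng.standard_normal(z_mean.shape)
    return z_mean + np.exp(0.5 * z_log_var) * eps

# With log-variance 0 (sigma = 1), z is mu plus standard normal noise.
z = sampling(np.zeros((4, 2)), np.zeros((4, 2)))
```

Because the noise enters additively rather than through the distribution parameters, gradients flow through `z_mean` and `z_log_var` during training.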
Callback
VAELossHistory
VAELossHistory ()
Abstract base class used to build new callbacks. Subclass this class and override any of the relevant hooks.
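A minimal sketch of what a loss-history callback can look like, assuming a hook such as `on_epoch_end` that receives per-epoch loss values (the hook and key names here are illustrative, not the package's actual interface):

```python
class VAELossHistory:
    """Minimal loss-history callback sketch: collects per-epoch loss terms
    so reconstruction and KL losses can be inspected or plotted later."""
    def __init__(self):
        self.history = {"loss": [], "reconstruction_loss": [], "kl_loss": []}

    def on_epoch_end(self, logs):
        # `logs` maps loss names to their values for the finished epoch
        for key in self.history:
            self.history[key].append(logs[key])

cb = VAELossHistory()
cb.on_epoch_end({"loss": 1.5, "reconstruction_loss": 1.2, "kl_loss": 0.3})
```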
VAEs Encoders and Decoders
VAEEncoder
VAEEncoder (latent_dim:int)
Abstract base for a VAE encoder: encodes data into `z_mean` and `z_log_var` in a `latent_dim`-dimensional latent space.
VAEDecoder
VAEDecoder (latent_dim:int)
Abstract base for a VAE decoder: decodes a latent vector `z` into reconstructed data.
Simple Convolutional Architecture
Kernel Sizes: 5, 7, 9, 13
Encoder
Conv5Encoder
Conv5Encoder (seq_len:int, feat_dim:int, latent_dim:int, dropout_rate:float)
Convolutional encoder for the simple Conv5 VAE architecture.
Decoder
Conv5Decoder
Conv5Decoder (seq_len:int, feat_dim:int, latent_dim:int, dropout_rate:float)
Convolutional decoder for the simple Conv5 VAE architecture.
Getter
get_conv5_vae_components
get_conv5_vae_components (seq_len:int, feat_dim:int, latent_dim:int, dropout_rate:float=0.2)
Creates and returns encoder and decoder components for a convolutional VAE architecture.
| | Type | Default | Details |
|---|---|---|---|
| seq_len | int | | Length of input sequence |
| feat_dim | int | | Dimensionality of input features |
| latent_dim | int | | Dimensionality of the latent space |
| dropout_rate | float | 0.2 | Dropout rate for regularization |
| Returns | tuple | | Encoder and decoder components |
Legit Tsgm
Encoder
Conv5EncoderLegitTsgm
Conv5EncoderLegitTsgm (seq_len:int, feat_dim:int, latent_dim:int, dropout_rate:float)
Conv5 encoder variant that follows the original TSGM implementation.
Decoder
Conv5DecoderLegitTsgm
Conv5DecoderLegitTsgm (seq_len:int, feat_dim:int, latent_dim:int, dropout_rate:float)
Conv5 decoder variant that follows the original TSGM implementation.
Getter
get_conv5_legit_tsgm_vae_components
get_conv5_legit_tsgm_vae_components (seq_len:int, feat_dim:int, latent_dim:int, dropout_rate:float=0.2)
Creates and returns encoder and decoder components for a Conv5 VAE model.
| | Type | Default | Details |
|---|---|---|---|
| seq_len | int | | Length of input sequence |
| feat_dim | int | | Dimensionality of input features |
| latent_dim | int | | Dimensionality of the latent space |
| dropout_rate | float | 0.2 | Dropout rate for regularization |
| Returns | tuple | | Encoder and decoder components |
Inception Time
The implementation in the following cell is taken from https://github.com/TheMrGhostman/InceptionTime-Pytorch/blob/master/inception.py ; the next cell adapts it to our problem.
InceptionBlock
InceptionBlock (in_channels, n_filters=32, kernel_sizes=[9, 19, 39], bottleneck_channels=32, use_residual=True, activation=ReLU(), return_indices=False)
Inherits from `torch.nn.Module`; see the base-class docstring above.
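The core InceptionTime idea is to run several 1-D convolutions with different (odd) kernel sizes over the same input in parallel and concatenate the results along the channel axis, preserving the sequence length with 'same' padding. A hedged numpy sketch of just that branch-and-concatenate step (real blocks additionally use a bottleneck, a max-pool branch, batch norm, and learned filters):

```python
import numpy as np

def inception_branch_concat(x, kernel_sizes=(9, 19, 39)):
    """Apply one 'same'-padded moving-average filter per kernel size to a
    1-D signal and stack the results as channels: (n_branches, seq_len)."""
    branches = [np.convolve(x, np.ones(k) / k, mode="same") for k in kernel_sizes]
    return np.stack(branches)

out = inception_branch_concat(np.arange(100, dtype=float))
# Sequence length is preserved; the channel count equals the number of branches.
```

Mixing receptive fields this way lets a single block pick up both short- and long-range patterns in the series.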
Inception
Inception (in_channels, n_filters, kernel_sizes=[9, 19, 39], bottleneck_channels=32, activation=ReLU(), return_indices=False)
Inherits from `torch.nn.Module`; see the base-class docstring above.
pass_through
pass_through (X)
Identity function: returns `X` unchanged.
correct_sizes
correct_sizes (sizes)
Adjusts kernel sizes to odd values so that symmetric 'same' padding is possible.
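In the referenced InceptionTime-Pytorch code, `correct_sizes` typically rounds even kernel sizes down to the nearest odd value, so that padding by `k // 2` keeps the output length equal to the input. A sketch under that assumption:

```python
def correct_sizes(sizes):
    """Round even kernel sizes down to the nearest odd value, so that
    symmetric padding of k // 2 preserves the sequence length exactly."""
    return [s if s % 2 != 0 else s - 1 for s in sizes]

sizes = correct_sizes([9, 19, 40])  # the even size 40 becomes 39
```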
InceptionTransposeBlockWithoutPool
InceptionTransposeBlockWithoutPool (in_channels, out_channels=32, kernel_sizes=[9, 19, 39], bottleneck_channels=32, use_residual=True, activation=ReLU())
Inherits from `torch.nn.Module`; see the base-class docstring above.
InceptionTransposeWithoutPool
InceptionTransposeWithoutPool (in_channels, out_channels, kernel_sizes=[9, 19, 39], bottleneck_channels=32, activation=ReLU())
Inherits from `torch.nn.Module`; see the base-class docstring above.
InceptionBlockWithoutPool
InceptionBlockWithoutPool (in_channels, n_filters=32, kernel_sizes=[9, 19, 39], bottleneck_channels=32, use_residual=True, activation=ReLU(), return_indices=False)
Inherits from `torch.nn.Module`; see the base-class docstring above.
InceptionWithoutPool
InceptionWithoutPool (in_channels, n_filters, kernel_sizes=[9, 19, 39], bottleneck_channels=32, activation=ReLU())
Inherits from `torch.nn.Module`; see the base-class docstring above.
Encoder
InceptionTimeVAEEncoder
InceptionTimeVAEEncoder (feat_dim=7, seq_len=100, n_filters=32, kernel_sizes=[5, 11, 23], bottleneck_channels=32, latent_dim=2)
VAE encoder built from InceptionTime blocks.
WPInceptionTimeVAEEncoder
WPInceptionTimeVAEEncoder (feat_dim=7, seq_len=100, n_filters=32, kernel_sizes=[5, 11, 23], bottleneck_channels=32, latent_dim=2)
VAE encoder built from InceptionTime blocks without pooling (the "WP" variant returned when `without_pooling=True`).
Decoder
WPInceptionTimeVAEDecoder
WPInceptionTimeVAEDecoder (feat_dim=7, seq_len=100, n_filters=32, kernel_sizes=[5, 11, 23], bottleneck_channels=32, latent_dim=2)
VAE decoder built from InceptionTime transpose blocks without pooling.
Getter
get_inception_time_vae_components
get_inception_time_vae_components (seq_len:int, feat_dim:int, latent_dim:int, without_pooling:bool=True, **model_kwargs:dict)
Returns encoder and decoder components for an InceptionTime-based VAE architecture.
| | Type | Default | Details |
|---|---|---|---|
| seq_len | int | | Length of input sequence |
| feat_dim | int | | Dimensionality of input features |
| latent_dim | int | | Dimensionality of the latent space |
| without_pooling | bool | True | If True, returns WPInceptionTimeVAEEncoder instead of InceptionTimeVAEEncoder |
| model_kwargs | dict | | Dictionary containing model-specific keyword arguments |
| Returns | tuple | | Encoder and decoder components |
cVAEs Encoders and Decoders
cVAEEncoder
cVAEEncoder (latent_dim:int)
Abstract base for a conditional VAE encoder: Encodes data + condition into z_mean, z_log_var
cVAEDecoder
cVAEDecoder (latent_dim:int)
Abstract base for a conditional VAE decoder: Decodes z + condition into reconstructed data
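The conditioning mechanism these docstrings describe amounts to concatenating the condition vector to the model's input: to the data on the encoder side, and to the latent code `z` on the decoder side. A hedged numpy sketch of the decoder-side step (shapes and names are illustrative):

```python
import numpy as np

def condition_latent(z, cond):
    """Concatenate a condition vector to the latent code along the feature
    axis, giving the decoder input shape (batch, latent_dim + cond_dim)."""
    return np.concatenate([z, cond], axis=-1)

# latent_dim = 2 and cond_dim = 1 yield a 3-feature decoder input per sample
decoder_input = condition_latent(np.zeros((8, 2)), np.ones((8, 1)))
```

Because the condition is visible to both networks, the decoder can later be driven with a chosen condition to generate class- or attribute-specific samples.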
Simple Convolutions
cConv5EncoderLegitTsgm
cConv5EncoderLegitTsgm (seq_len:int, feat_dim:int, latent_dim:int, cond_dim:int, dropout_rate:float)
Abstract base for a conditional VAE encoder: Encodes data + condition into z_mean, z_log_var
cConv5DecoderLegitTsgm
cConv5DecoderLegitTsgm (seq_len:int, feat_dim:int, latent_dim:int, cond_dim:int, dropout_rate:float)
Abstract base for a conditional VAE decoder: Decodes z + condition into reconstructed data
get_conditional_conv5_legit_tsgm_vae_components
get_conditional_conv5_legit_tsgm_vae_components (seq_len:int, feat_dim:int, latent_dim:int, dropout_rate:float=0.2, cond_dim:int=1)
Creates encoder and decoder components for a conditional convolutional VAE.
| | Type | Default | Details |
|---|---|---|---|
| seq_len | int | | Length of input sequence |
| feat_dim | int | | Dimensionality of input features |
| latent_dim | int | | Dimensionality of the latent space |
| dropout_rate | float | 0.2 | Dropout rate for regularization |
| cond_dim | int | 1 | Dimensionality of conditional input |
| Returns | tuple | | Encoder and decoder components |