advsecurenet.shared package



advsecurenet.shared.adversarial_evaluators module

advsecurenet.shared.colors module

This module defines the colors used throughout the project for printing messages on the console.

advsecurenet.shared.loss module

class advsecurenet.shared.loss.Loss(value)

Bases: Enum

Supported loss functions for model training. Taken from https://pytorch.org/docs/stable/nn.html#loss-functions.

BCE_LOSS = <class 'torch.nn.modules.loss.BCELoss'>
BCE_WITH_LOGITS_LOSS = <class 'torch.nn.modules.loss.BCEWithLogitsLoss'>
COSINE_EMBEDDING_LOSS = <class 'torch.nn.modules.loss.CosineEmbeddingLoss'>
CROSS_ENTROPY = <class 'torch.nn.modules.loss.CrossEntropyLoss'>
HINGE_EMBEDDING_LOSS = <class 'torch.nn.modules.loss.HingeEmbeddingLoss'>
KLDIV_LOSS = <class 'torch.nn.modules.loss.KLDivLoss'>
L1_LOSS = <class 'torch.nn.modules.loss.L1Loss'>
MARGIN_RANKING_LOSS = <class 'torch.nn.modules.loss.MarginRankingLoss'>
MSELoss = <class 'torch.nn.modules.loss.MSELoss'>
MULTI_LABEL_MARGIN_LOSS = <class 'torch.nn.modules.loss.MultiLabelMarginLoss'>
MULTI_MARGIN_LOSS = <class 'torch.nn.modules.loss.MultiMarginLoss'>
NLL_LOSS = <class 'torch.nn.modules.loss.NLLLoss'>
NLL_LOSS2D = <class 'torch.nn.modules.loss.NLLLoss2d'>
POISSON_NLL_LOSS = <class 'torch.nn.modules.loss.PoissonNLLLoss'>
SMOOTH_L1_LOSS = <class 'torch.nn.modules.loss.SmoothL1Loss'>
SOFT_MARGIN_LOSS = <class 'torch.nn.modules.loss.SoftMarginLoss'>
TRIPLET_MARGIN_LOSS = <class 'torch.nn.modules.loss.TripletMarginLoss'>
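
Because each member's value is the underlying torch loss class, a criterion can be instantiated directly from the enum. A minimal sketch (the direct attribute access shown here is illustrative; a training pipeline may instead resolve the member from a configuration string):

    import torch

    from advsecurenet.shared.loss import Loss

    # The enum value is the torch class itself; calling it builds a criterion.
    criterion = Loss.CROSS_ENTROPY.value()   # torch.nn.CrossEntropyLoss()

    logits = torch.randn(4, 10)              # batch of 4 samples, 10 classes
    targets = torch.randint(0, 10, (4,))     # ground-truth class indices
    loss = criterion(logits, targets)        # scalar loss tensor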

advsecurenet.shared.normalization_params module

class advsecurenet.shared.normalization_params.NormalizationParameters

Bases: object

Class to handle retrieval and management of normalization parameters for selected datasets. The normalization parameters are the mean and standard deviation values for each channel of the dataset.

Supported datasets:

- CIFAR-10
- CIFAR-100
- ImageNet
- MNIST
- SVHN
- Fashion-MNIST

DATASETS = {
    DatasetType.CIFAR10: {'mean': [0.4914, 0.4822, 0.4465], 'std': [0.247, 0.2435, 0.2616]},
    DatasetType.CIFAR100: {'mean': [0.5071, 0.4867, 0.4408], 'std': [0.2675, 0.2565, 0.2761]},
    DatasetType.FASHION_MNIST: {'mean': [0.286], 'std': [0.353]},
    DatasetType.IMAGENET: {'mean': [0.485, 0.456, 0.406], 'std': [0.229, 0.224, 0.225]},
    DatasetType.MNIST: {'mean': [0.1307], 'std': [0.3081]},
    DatasetType.SVHN: {'mean': [0.4377, 0.4438, 0.4728], 'std': [0.198, 0.201, 0.197]},
}
static get_params(dataset_name: DatasetType | str) → DotDict

Retrieve normalization parameters for a specified dataset. The parameters are the mean and standard deviation values for each channel of the dataset.

Parameters:

dataset_name (DatasetType or str) -- The name of the dataset to retrieve parameters for, given either as a DatasetType enum member or a string.

Returns:

A dictionary-like object containing the mean and standard deviation values for the dataset.

Return type:

DotDict

static list_datasets() → list

List available datasets.
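
A minimal usage sketch, assuming the returned DotDict exposes the 'mean' and 'std' entries as attributes and that the plain dataset name is an accepted string form; the retrieved values can be passed directly to torchvision.transforms.Normalize:

    from torchvision import transforms

    from advsecurenet.shared.normalization_params import NormalizationParameters

    print(NormalizationParameters.list_datasets())   # names of the supported datasets

    # "CIFAR10" as a plain string; a DatasetType member works as well.
    params = NormalizationParameters.get_params("CIFAR10")
    normalize = transforms.Normalize(mean=params.mean, std=params.std)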

advsecurenet.shared.optimizer module

class advsecurenet.shared.optimizer.Optimizer(value)

Bases: Enum

Enum class representing different optimization algorithms.

ADAGRAD = <class 'torch.optim.adagrad.Adagrad'>
ADAM = <class 'torch.optim.adam.Adam'>
ADAMAX = <class 'torch.optim.adamax.Adamax'>
ADAMW = <class 'torch.optim.adamw.AdamW'>
ASGD = <class 'torch.optim.asgd.ASGD'>
LBFGS = <class 'torch.optim.lbfgs.LBFGS'>
RMS_PROP = <class 'torch.optim.rmsprop.RMSprop'>
R_PROP = <class 'torch.optim.rprop.Rprop'>
SGD = <class 'torch.optim.sgd.SGD'>
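
As with Loss, each member's value is the torch optimizer class, so an optimizer can be constructed directly from the enum. A minimal sketch; the model and hyperparameters are illustrative:

    import torch

    from advsecurenet.shared.optimizer import Optimizer

    model = torch.nn.Linear(10, 2)

    # Optimizer.ADAM.value is torch.optim.Adam; construct it as usual.
    optimizer = Optimizer.ADAM.value(model.parameters(), lr=1e-3)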

advsecurenet.shared.scheduler module

class advsecurenet.shared.scheduler.Scheduler(value)

Bases: Enum

Supported schedulers for adjusting the learning rate during training. Taken from https://pytorch.org/docs/stable/optim.html.

COSINE_ANNEALING_LR = <class 'torch.optim.lr_scheduler.CosineAnnealingLR'>
COSINE_ANNEALING_WARM_RESTARTS = <class 'torch.optim.lr_scheduler.CosineAnnealingWarmRestarts'>
CYCLIC_LR = <class 'torch.optim.lr_scheduler.CyclicLR'>
EXPONENTIAL_LR = <class 'torch.optim.lr_scheduler.ExponentialLR'>
LAMBDA_LR = <class 'torch.optim.lr_scheduler.LambdaLR'>
LINEAR_LR = <class 'torch.optim.lr_scheduler.LinearLR'>
MULTI_STEP_LR = <class 'torch.optim.lr_scheduler.MultiStepLR'>
ONE_CYCLE_LR = <class 'torch.optim.lr_scheduler.OneCycleLR'>
POLY_LR = <class 'torch.optim.lr_scheduler.PolynomialLR'>
REDUCE_LR_ON_PLATEAU = <class 'torch.optim.lr_scheduler.ReduceLROnPlateau'>
STEP_LR = <class 'torch.optim.lr_scheduler.StepLR'>
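
A minimal sketch combining the Optimizer and Scheduler enums; the keyword arguments shown (step_size, gamma) belong to torch's StepLR and are illustrative:

    import torch

    from advsecurenet.shared.optimizer import Optimizer
    from advsecurenet.shared.scheduler import Scheduler

    model = torch.nn.Linear(10, 2)
    optimizer = Optimizer.SGD.value(model.parameters(), lr=0.1)

    # Scheduler.STEP_LR.value is torch.optim.lr_scheduler.StepLR.
    scheduler = Scheduler.STEP_LR.value(optimizer, step_size=10, gamma=0.5)

    for _ in range(3):        # one scheduler step per epoch
        optimizer.step()      # normally preceded by a backward pass
        scheduler.step()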