holocron.utils#

holocron.utils provides general-purpose utilities.

Miscellaneous#

holocron.utils.lr_finder(batch_training_fn, model, train_loader, optimizer, criterion, device=None, start_lr=1e-07, end_lr=10, num_it=100, stop_div=True, stop_threshold=2, beta=0.9)[source]#

Learning rate finder as described in “Cyclical Learning Rates for Training Neural Networks”

Parameters:
  • batch_training_fn (callable) – function used to train a model for a step

  • model (torch.nn.Module) – model to train

  • train_loader (torch.utils.data.DataLoader) – training dataloader

  • optimizer (torch.optim.Optimizer) – model parameter optimizer

  • criterion (torch.nn.Module) – loss computation function

  • device (torch.device, optional) – device to perform iterations on

  • start_lr (float) – initial learning rate

  • end_lr (float) – peak learning rate

  • num_it (int) – number of iterations to perform

  • stop_div (bool) – whether evaluation should be stopped if the loss diverges

  • stop_threshold (float) – if stop_div is True, evaluation stops when the smoothed loss exceeds stop_threshold * best_loss

  • beta (float) – smoothing parameter for the exponential moving average of the loss

Returns:

  • lrs (list<float>) – list of used learning rates

  • losses (list<float>) – list of training losses

Return type:

tuple<list<float>, list<float>>
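
The mechanics behind a finder like this can be sketched without the training loop: learning rates are spaced exponentially between start_lr and end_lr over num_it iterations, and the recorded losses are smoothed with a bias-corrected exponential moving average controlled by beta. The snippet below is a minimal, self-contained illustration of those two pieces; the helper names (lr_schedule, smooth_losses) are illustrative, not part of holocron's API.

```python
def lr_schedule(start_lr=1e-7, end_lr=10.0, num_it=100):
    """Exponentially spaced learning rates from start_lr to end_lr (inclusive)."""
    ratio = end_lr / start_lr
    return [start_lr * ratio ** (i / (num_it - 1)) for i in range(num_it)]

def smooth_losses(raw_losses, beta=0.9):
    """Bias-corrected exponential moving average of the raw training losses."""
    avg, smoothed = 0.0, []
    for i, loss in enumerate(raw_losses, start=1):
        avg = beta * avg + (1 - beta) * loss
        smoothed.append(avg / (1 - beta ** i))  # correct the zero-initialization bias
    return smoothed

lrs = lr_schedule()
print(lrs[0], lrs[-1])  # endpoints match start_lr and end_lr
print(smooth_losses([1.0, 1.0, 1.0]))  # a constant loss stays constant after correction
```

With stop_div enabled, the sweep would additionally break out of the loop as soon as the smoothed loss exceeds stop_threshold times the best smoothed loss seen so far, which keeps the tail of the curve from being dominated by divergence.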