trainers
class transfer_nlp.plugins.trainers.BasicTrainer(model: torch.nn.Module, dataset_splits: transfer_nlp.loaders.loaders.DatasetSplits, loss: torch.nn.Module, optimizer: torch.optim.Optimizer, metrics: Dict[str, ignite.metrics.Metric], experiment_config: transfer_nlp.plugins.config.ExperimentConfig, device: str = None, num_epochs: int = 1, seed: int = None, cuda: bool = None, loss_accumulation_steps: int = 4, scheduler: Any = None, regularizer: transfer_nlp.plugins.regularizers.RegularizerABC = None, gradient_clipping: float = 1.0, output_transform=None, tensorboard_logs: str = None, embeddings_name: str = None, finetune: bool = False)
This class provides the abstraction interface for customizing runners. The training loop is built on the engine logic of pytorch-ignite.
See the experiments directory for examples of experiment JSON files.
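One parameter worth highlighting is loss_accumulation_steps (default 4): gradients are accumulated over several batches before the optimizer steps, which emulates a larger effective batch size. The following framework-free sketch only illustrates that stepping schedule; the optimizer_step_indices helper is hypothetical (not part of transfer-nlp), and the assumption that leftover gradients are flushed after the final batch is an illustration, not a guarantee of BasicTrainer's behavior.

```python
def optimizer_step_indices(num_batches: int, accumulation_steps: int) -> list:
    """Return the 1-based batch indices at which an accumulating trainer
    would call optimizer.step(), assuming (hypothetically) that it also
    flushes any remaining accumulated gradients after the last batch."""
    steps = [i for i in range(1, num_batches + 1) if i % accumulation_steps == 0]
    if num_batches % accumulation_steps != 0:
        steps.append(num_batches)  # flush leftover gradients at the end
    return steps

# With the default loss_accumulation_steps=4 over 10 batches,
# the optimizer would step after batches 4, 8, and 10:
print(optimizer_step_indices(10, 4))  # [4, 8, 10]
```

With accumulation, the per-batch loss is typically divided by accumulation_steps before backpropagation so that the summed gradients match what a single large batch would produce.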