gigl.experimental.knowledge_graph_embedding.lib.checkpoint#

Attributes#

logger

Functions#

maybe_load_checkpoint(model, optimizer, ...)
    Load the model and optimizer checkpoints if they exist.

maybe_save_checkpoint(model, optimizer, ...[, ...])
    Save the model and optimizer checkpoints if specified in the training configuration.

Module Contents#

gigl.experimental.knowledge_graph_embedding.lib.checkpoint.maybe_load_checkpoint(model, optimizer, checkpointing_config)[source]#

Load the model and optimizer checkpoints if they exist.

Parameters:
  • model (torch.nn.Module) – The model to load the checkpoint into.

  • optimizer (torch.optim.Optimizer) – The optimizer to load the checkpoint into.

  • checkpointing_config (gigl.experimental.knowledge_graph_embedding.lib.config.training.CheckpointingConfig) – The checkpointing configuration containing the checkpoint paths.

Returns:

True if the model and optimizer were loaded successfully, False otherwise.

Return type:

bool
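
A minimal usage sketch for resuming training. The CheckpointingConfig construction is elided because its fields are not documented on this page, and the Linear model is a stand-in for a real knowledge graph embedding model:

```python
import torch

from gigl.experimental.knowledge_graph_embedding.lib.checkpoint import (
    maybe_load_checkpoint,
)
from gigl.experimental.knowledge_graph_embedding.lib.config.training import (
    CheckpointingConfig,
)

# Stand-in model and optimizer; a real run would use the KGE model.
model = torch.nn.Linear(128, 128)
optimizer = torch.optim.Adam(model.parameters())

# ASSUMPTION: CheckpointingConfig's constructor fields are not shown on this
# page, so construction is elided; build it from your training configuration.
checkpointing_config: CheckpointingConfig = ...

# Returns True only if both the model and optimizer state were restored.
resumed = maybe_load_checkpoint(model, optimizer, checkpointing_config)
if not resumed:
    print("No checkpoint found; training starts from scratch.")
```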

gigl.experimental.knowledge_graph_embedding.lib.checkpoint.maybe_save_checkpoint(model, optimizer, checkpointing_config, checkpoint_id='')[source]#

Save the model and optimizer checkpoints if specified in the training configuration.

Parameters:
  • model (torch.nn.Module) – The model to save the checkpoint for.

  • optimizer (torch.optim.Optimizer) – The optimizer to save the checkpoint for.

  • checkpointing_config (gigl.experimental.knowledge_graph_embedding.lib.config.training.CheckpointingConfig) – The checkpointing configuration containing the checkpoint paths.

  • checkpoint_id (str) – An optional identifier for the checkpoint, used to differentiate between checkpoints if needed.

Returns:

The URI where the checkpoint was saved, or a Future object if saved asynchronously. If no checkpointing path is specified, returns None.

Return type:

Optional[Union[Future[Uri], Uri]]
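
A sketch of the save path, again with the config construction elided; the checkpoint_id value and the duck-typed Future handling are illustrative, not prescribed by the API:

```python
import torch

from gigl.experimental.knowledge_graph_embedding.lib.checkpoint import (
    maybe_save_checkpoint,
)
from gigl.experimental.knowledge_graph_embedding.lib.config.training import (
    CheckpointingConfig,
)

model = torch.nn.Linear(128, 128)
optimizer = torch.optim.Adam(model.parameters())
checkpointing_config: CheckpointingConfig = ...  # elided, as above

# Tag the checkpoint with the epoch so successive saves don't collide.
result = maybe_save_checkpoint(
    model, optimizer, checkpointing_config, checkpoint_id="epoch_3"
)
if result is None:
    print("No checkpointing path configured; nothing was saved.")
elif hasattr(result, "result"):
    # Asynchronous save: a Future[Uri]; block until the write completes.
    uri = result.result()
else:
    uri = result  # synchronous save returned the Uri directly
```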

gigl.experimental.knowledge_graph_embedding.lib.checkpoint.logger[source]#