gigl.experimental.knowledge_graph_embedding.lib.checkpoint#
Attributes#
Functions#
| maybe_load_checkpoint(model, optimizer, checkpointing_config) | Load the model and optimizer checkpoints if they exist. |
| maybe_save_checkpoint(model, optimizer, checkpointing_config, checkpoint_id) | Save the model and optimizer checkpoints if specified in the training configuration. |
Module Contents#
- gigl.experimental.knowledge_graph_embedding.lib.checkpoint.maybe_load_checkpoint(model, optimizer, checkpointing_config)[source]#
- Load the model and optimizer checkpoints if they exist.
- Parameters:
- model (torch.nn.Module) – The model to load the checkpoint into. 
- optimizer (torch.optim.Optimizer) – The optimizer to load the checkpoint into. 
- checkpointing_config (gigl.experimental.knowledge_graph_embedding.lib.config.training.CheckpointingConfig) – The training configuration containing the checkpointing paths. 
 
- Returns:
- True if the model and optimizer were loaded successfully, False otherwise. 
- Return type:
- bool 
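A minimal usage sketch for maybe_load_checkpoint. Only the call signature comes from the entry above; the model, the optimizer, and the hypothetical load_training_config() helper used to obtain checkpointing_config are illustrative assumptions.

```python
import torch

from gigl.experimental.knowledge_graph_embedding.lib.checkpoint import maybe_load_checkpoint

# Illustrative model and optimizer; any torch.nn.Module / torch.optim.Optimizer
# pair matching the signature above will do.
model = torch.nn.Embedding(num_embeddings=1_000, embedding_dim=64)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Assumed to be parsed from the job's training configuration; its fields are
# defined by lib.config.training.CheckpointingConfig and are not shown here.
checkpointing_config = load_training_config().checkpointing  # hypothetical helper

# Restores model and optimizer state in place when a checkpoint exists.
if maybe_load_checkpoint(model, optimizer, checkpointing_config):
    print("Resumed model and optimizer state from checkpoint.")
else:
    print("No checkpoint found; training from scratch.")
```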
 
- gigl.experimental.knowledge_graph_embedding.lib.checkpoint.maybe_save_checkpoint(model, optimizer, checkpointing_config, checkpoint_id='')[source]#
- Save the model and optimizer checkpoints if specified in the training configuration.
- Parameters:
- model (torch.nn.Module) – The model to save the checkpoint for. 
- optimizer (torch.optim.Optimizer) – The optimizer to save the checkpoint for. 
- checkpointing_config (gigl.experimental.knowledge_graph_embedding.lib.config.training.CheckpointingConfig) – The training configuration containing the checkpointing paths. 
- checkpoint_id (str) – An optional identifier for the checkpoint, used to differentiate between checkpoints if needed. 
 
- Returns:
- The URI where the checkpoint was saved, or a Future object if saved asynchronously. If no checkpointing path is specified, returns None. 
- Return type:
 

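A hedged sketch of periodic saving inside a training loop, continuing the example above. train_one_epoch, num_epochs, and the epoch-based checkpoint_id format are assumptions; the handling of the return value follows the Returns description for maybe_save_checkpoint.

```python
from gigl.experimental.knowledge_graph_embedding.lib.checkpoint import maybe_save_checkpoint

# model, optimizer, and checkpointing_config are set up as in the sketch above.
num_epochs = 10  # assumed; normally taken from the training configuration

for epoch in range(num_epochs):
    train_one_epoch(model, optimizer)  # hypothetical training step
    result = maybe_save_checkpoint(
        model,
        optimizer,
        checkpointing_config,
        checkpoint_id=f"epoch_{epoch}",  # optional tag to tell checkpoints apart
    )
    # result is the checkpoint URI, a Future when the save is asynchronous,
    # or None when no checkpointing path is configured.
    if result is not None:
        print(f"Checkpoint for epoch {epoch}: {result}")
```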