gigl.experimental.knowledge_graph_embedding.lib.config.sampling#
Classes#
| SamplingConfig | Configuration for negative sampling strategy during knowledge graph embedding training. |
Module Contents#
- class gigl.experimental.knowledge_graph_embedding.lib.config.sampling.SamplingConfig[source]#
- Configuration for negative sampling strategy during knowledge graph embedding training.

  Negative sampling is crucial for contrastive learning in knowledge graph embeddings, where the model learns to distinguish between true (positive) and false (negative) edges.

 - negative_corruption_side[source]#
- Which side of the edge to corrupt for negative sampling. NegativeSamplingCorruptionType.DST corrupts the destination node; NegativeSamplingCorruptionType.SRC corrupts the source node. Defaults to NegativeSamplingCorruptionType.DST.

  - Type: NegativeSamplingCorruptionType
 - positive_edge_batch_size[source]#
- Number of positive (true) edges to process in each batch. Controls memory usage and training stability. Defaults to 1024.

  - Type: int
 
 - num_inbatch_negatives_per_edge[source]#
- Number of negative samples generated per positive edge using other edges in the same batch. This is memory-efficient but may have limited diversity. Defaults to 0 (disabled).

  - Type: int
 
 - num_random_negatives_per_edge[source]#
- Number of negative samples generated per positive edge by randomly corrupting nodes. Provides high diversity but requires more computation. Defaults to 1024.

  - Type: int
 
 - negative_corruption_side: gigl.experimental.knowledge_graph_embedding.lib.model.types.NegativeSamplingCorruptionType[source]#
 
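A minimal sketch of how these fields fit together, assuming a dataclass-style constructor. The `SamplingConfig` and `NegativeSamplingCorruptionType` definitions below are illustrative stand-ins mirroring the documented attribute names and defaults, not the library's actual implementation; the arithmetic shows how batch size and per-edge negative counts combine.

```python
from dataclasses import dataclass
from enum import Enum


class NegativeSamplingCorruptionType(Enum):
    # Hypothetical enum values; the real type lives in
    # gigl.experimental.knowledge_graph_embedding.lib.model.types.
    SRC = "src"
    DST = "dst"


@dataclass
class SamplingConfig:
    # Field names and defaults mirror the documented attributes.
    negative_corruption_side: NegativeSamplingCorruptionType = (
        NegativeSamplingCorruptionType.DST
    )
    positive_edge_batch_size: int = 1024
    num_inbatch_negatives_per_edge: int = 0
    num_random_negatives_per_edge: int = 1024


# Enable a small number of in-batch negatives alongside the default
# random negatives; destination-side corruption is the default.
config = SamplingConfig(num_inbatch_negatives_per_edge=32)

# Each positive edge is contrasted against its in-batch plus random
# negatives, so the total negatives scored per batch is:
negatives_per_edge = (
    config.num_inbatch_negatives_per_edge
    + config.num_random_negatives_per_edge
)
total_negatives = config.positive_edge_batch_size * negatives_per_edge
print(total_negatives)  # 1024 * (32 + 1024) = 1081344
```

This illustrates the memory trade-off the attribute docs describe: raising `num_random_negatives_per_edge` multiplies the scoring work per batch, while in-batch negatives reuse nodes already materialized in the batch.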
