gigl.distributed.sampler#

Attributes#

NEGATIVE_LABEL_METADATA_KEY

POSITIVE_LABEL_METADATA_KEY

Classes#

ABLPNodeSamplerInput

Sampler input specific to the ABLP use case. Contains additional information about positive labels, negative labels, and the corresponding supervision node type.

Functions#

metadata_key_with_prefix(key)

Prefixes the key with "#META", following the same convention as GLT (alibaba/graphlearn-for-pytorch).

Module Contents#

class gigl.distributed.sampler.ABLPNodeSamplerInput(node, input_type, positive_label_by_edge_types, negative_label_by_edge_types)[source]#

Bases: graphlearn_torch.sampler.NodeSamplerInput

Sampler input specific to the ABLP use case. Contains additional information about positive labels, negative labels, and the corresponding supervision node type.

Parameters:
  • node (torch.Tensor) – Anchor nodes to fan out from

  • input_type (Optional[Union[str, NodeType]]) – Node type of the anchor nodes

  • positive_label_by_edge_types (dict[EdgeType, torch.Tensor]) – Positive label nodes to fan out from, keyed by edge type

  • negative_label_by_edge_types (dict[EdgeType, torch.Tensor]) – Negative label nodes to fan out from, keyed by edge type

property negative_label_by_edge_types: dict[gigl.src.common.types.graph_data.EdgeType, torch.Tensor][source]#
Return type:

dict[gigl.src.common.types.graph_data.EdgeType, torch.Tensor]

property positive_label_by_edge_types: dict[gigl.src.common.types.graph_data.EdgeType, torch.Tensor][source]#
Return type:

dict[gigl.src.common.types.graph_data.EdgeType, torch.Tensor]
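
A minimal construction sketch (not taken from the documented source): the concrete node types, relation names, and label IDs below are illustrative, and the EdgeType(NodeType, Relation, NodeType) construction is an assumption based on gigl.src.common.types.graph_data.

```python
import torch

from gigl.distributed.sampler import ABLPNodeSamplerInput
from gigl.src.common.types.graph_data import EdgeType, NodeType, Relation

# Hypothetical supervision edge types for a "user -> item" graph; the
# (src_node_type, relation, dst_node_type) form is assumed, not documented here.
positive_edge_type = EdgeType(NodeType("user"), Relation("to_positive"), NodeType("item"))
negative_edge_type = EdgeType(NodeType("user"), Relation("to_negative"), NodeType("item"))

sampler_input = ABLPNodeSamplerInput(
    node=torch.tensor([0, 1, 2]),  # anchor nodes to fan out from
    input_type=NodeType("user"),   # node type of the anchor nodes
    positive_label_by_edge_types={
        positive_edge_type: torch.tensor([10, 11, 12]),  # positive label nodes
    },
    negative_label_by_edge_types={
        negative_edge_type: torch.tensor([20, 21, 22]),  # negative label nodes
    },
)

# The label tensors are exposed back through the properties documented above.
assert sampler_input.positive_label_by_edge_types[positive_edge_type].numel() == 3
```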

gigl.distributed.sampler.metadata_key_with_prefix(key)[source]#

Prefixes the key with "#META", following the same convention as GLT (alibaba/graphlearn-for-pytorch).

Parameters:

key (str)

Return type:

str

gigl.distributed.sampler.NEGATIVE_LABEL_METADATA_KEY: Final[str] = 'gigl_negative_labels.'[source]#
gigl.distributed.sampler.POSITIVE_LABEL_METADATA_KEY: Final[str] = 'gigl_positive_labels.'[source]#
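
A brief usage sketch for the metadata key helpers (illustrative only; the "user-to-item" edge-type suffix and the concatenation pattern are assumptions, not documented usage):

```python
from gigl.distributed.sampler import (
    NEGATIVE_LABEL_METADATA_KEY,
    POSITIVE_LABEL_METADATA_KEY,
    metadata_key_with_prefix,
)

# Both constants end with a trailing ".", so an edge-type identifier can be
# appended to form a per-edge-type metadata key (assumed pattern, for illustration).
positive_key = metadata_key_with_prefix(POSITIVE_LABEL_METADATA_KEY + "user-to-item")
negative_key = metadata_key_with_prefix(NEGATIVE_LABEL_METADATA_KEY + "user-to-item")

# Both keys now carry the "#META" prefix, mirroring GLT's metadata-key convention.
print(positive_key, negative_key)
```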