gigl.distributed.utils.dist_sampler#
Sampler factory helpers shared across sampling producers.
Attributes#
- Union of all supported sampler input types.
- Union of all supported GiGL sampler runtime types.
Functions#
- create_dist_sampler – Create a GiGL sampler runtime for one channel on one worker.
Module Contents#
- gigl.distributed.utils.dist_sampler.create_dist_sampler(*, data, sampling_config, worker_options, channel, sampler_options, degree_tensors, current_device)[source]#
Create a GiGL sampler runtime for one channel on one worker.
- Parameters:
data (graphlearn_torch.distributed.DistDataset) – The distributed dataset containing graph topology and features.
sampling_config (graphlearn_torch.sampler.SamplingConfig) – Configuration for sampling behavior (neighbors, edges, etc.).
worker_options (Union[graphlearn_torch.distributed.MpDistSamplingWorkerOptions, graphlearn_torch.distributed.RemoteDistSamplingWorkerOptions]) – Worker-level options (RPC settings, device placement, concurrency).
channel (graphlearn_torch.channel.ChannelBase) – The communication channel for passing sampled messages.
sampler_options (gigl.distributed.sampler_options.SamplerOptions) – Algorithm-specific options (k-hop or PPR).
degree_tensors (Optional[Union[torch.Tensor, dict[graphlearn_torch.typing.EdgeType, torch.Tensor]]]) – Pre-computed degree tensors required by PPR sampling. Must not be None when sampler_options is PPRSamplerOptions.
current_device (torch.device) – The device on which sampling will run.
- Returns:
A configured sampler runtime, either DistNeighborSampler or DistPPRNeighborSampler.
- Raises:
NotImplementedError – If sampler_options is an unsupported type.
- Return type:
SamplerRuntime
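The contract above (k-hop options produce a DistNeighborSampler, PPR options require degree_tensors and produce a DistPPRNeighborSampler, anything else raises NotImplementedError) can be sketched as a type dispatch. This is a minimal illustration, not the real implementation: the option and runtime classes below are simplified hypothetical stand-ins for the gigl and graphlearn_torch types named in the signature.

```python
from dataclasses import dataclass, field
from typing import Any, Optional


# Hypothetical stand-ins; only PPRSamplerOptions and the two runtime
# class names appear in the documentation above.
@dataclass
class KHopSamplerOptions:
    num_neighbors: list[int] = field(default_factory=lambda: [10, 5])


@dataclass
class PPRSamplerOptions:
    alpha: float = 0.15  # hypothetical field for illustration


class DistNeighborSampler:
    """Stand-in for the k-hop sampler runtime."""

    def __init__(self, options: KHopSamplerOptions) -> None:
        self.options = options


class DistPPRNeighborSampler:
    """Stand-in for the PPR sampler runtime."""

    def __init__(self, options: PPRSamplerOptions, degree_tensors: Any) -> None:
        self.options = options
        self.degree_tensors = degree_tensors


def create_dist_sampler(*, sampler_options, degree_tensors: Optional[Any] = None):
    """Dispatch on the options type, mirroring the documented contract."""
    if isinstance(sampler_options, PPRSamplerOptions):
        # degree_tensors must not be None when sampler_options is PPRSamplerOptions.
        if degree_tensors is None:
            raise ValueError("degree_tensors is required for PPR sampling")
        return DistPPRNeighborSampler(sampler_options, degree_tensors)
    if isinstance(sampler_options, KHopSamplerOptions):
        return DistNeighborSampler(sampler_options)
    # Unsupported option types are rejected, per the Raises section above.
    raise NotImplementedError(
        f"Unsupported sampler options type: {type(sampler_options).__name__}"
    )
```

In the real API each worker would pass its dataset, sampling config, channel, and device alongside the options; the sketch keeps only the arguments that drive the dispatch.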