ray.rllib.algorithms.algorithm.Algorithm.env_runner_group
- Algorithm.env_runner_group: EnvRunnerGroup | None = None
The EnvRunnerGroup of the Algorithm. An EnvRunnerGroup is composed of a single local EnvRunner (see self.env_runner), which serves as the reference copy of the models to be trained, and optionally one or more remote EnvRunners used to generate training samples from the RL environment in parallel. The EnvRunnerGroup is fault-tolerant and elastic: it tracks the health state of all managed remote EnvRunner actors. As a result, the Algorithm should never access the underlying actor handles directly. Instead, always go through the group's foreach APIs, addressing individual EnvRunners by their assigned IDs, as in the sketch below.
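Below is a minimal sketch of accessing the group through the Algorithm rather than through raw actor handles. It assumes RLlib's PPO algorithm, the CartPole-v1 environment, and a foreach-style method named foreach_env_runner; the exact method and config-call names may differ between RLlib versions.

```python
from ray.rllib.algorithms.ppo import PPOConfig

# Build a small PPO Algorithm with two remote EnvRunners
# (config method names assume a recent RLlib version).
algo = (
    PPOConfig()
    .environment("CartPole-v1")
    .env_runners(num_env_runners=2)
    .build()
)

# The EnvRunnerGroup is exposed on the Algorithm; do not grab the
# remote actor handles it manages directly.
group = algo.env_runner_group

# Apply a function to the managed EnvRunners via a foreach API and
# collect one result per (healthy) EnvRunner.
class_names = group.foreach_env_runner(
    lambda env_runner: type(env_runner).__name__
)
print(class_names)

algo.stop()
```

Because the group tracks actor health, results are only gathered from EnvRunners it considers healthy, which is what makes the foreach APIs the safe access path compared to holding on to actor handles yourself.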