Labels
bug (Something that is supposed to be working, but isn't), community-backlog, rllib (RLlib related issues), stability, triage (Needs triage, e.g. priority, bug/not-bug, and owning component)
Description
What happened + What you expected to happen
In ray/rllib/algorithms/algorithm_config.py (around line 4546), the construction of a MultiRLModuleSpec does not pass along model_config, so the MultiRLModule-level configuration is missing at runtime. The current snippet is:
# Now construct the proper MultiRLModuleSpec.
# We need to infer the multi-agent class from `current_rl_module_spec`
# and fill in the module_specs dict.
multi_rl_module_spec = current_rl_module_spec.__class__(
    multi_rl_module_class=current_rl_module_spec.multi_rl_module_class,
    rl_module_specs=module_specs,
    modules_to_load=current_rl_module_spec.modules_to_load,
    load_state_path=current_rl_module_spec.load_state_path,
)

However, it should probably be:
# Now construct the proper MultiRLModuleSpec.
# We need to infer the multi-agent class from `current_rl_module_spec`
# and fill in the module_specs dict.
multi_rl_module_spec = current_rl_module_spec.__class__(
    multi_rl_module_class=current_rl_module_spec.multi_rl_module_class,
    rl_module_specs=module_specs,
    modules_to_load=current_rl_module_spec.modules_to_load,
    load_state_path=current_rl_module_spec.load_state_path,
    model_config=current_rl_module_spec.model_config,
)

I was trying to use a custom MultiRLModule that needs some global configuration, but I was unable to pass it to the env runners via the model_config argument, which caused Algorithm.build_algo to fail. Adding the missing argument fixed the problem.
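Until this is fixed upstream, one possible workaround is to re-attach the dropped model_config to the spec the config produces. The sketch below is only illustrative and rests on two assumptions: that the snippet above lives inside AlgorithmConfig.get_multi_rl_module_spec(), and that the spec passed via .rl_module(rl_module_spec=...) is stored on the config's _rl_module_spec attribute; PatchedDQNConfig is a hypothetical name.

class PatchedDQNConfig(DQNConfig):
    # Hypothetical workaround: copy the MultiRLModule-level model_config
    # back onto the spec returned by the base implementation, which is
    # assumed to drop it (see the snippet above).
    def get_multi_rl_module_spec(self, *args, **kwargs):
        spec = super().get_multi_rl_module_spec(*args, **kwargs)
        user_spec = self._rl_module_spec  # assumed storage of the user-provided spec
        if user_spec is not None and getattr(user_spec, "model_config", None):
            spec.model_config = user_spec.model_config
        return spec

With such a subclass, the reproduction script below would use PatchedDQNConfig() in place of DQNConfig().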
Versions / Dependencies
- Ray version: 2.48.0
- Python version: 3.10
Reproduction script
from ray.rllib.core.rl_module.torch import TorchRLModule
from ray.rllib.core.rl_module import MultiRLModule, MultiRLModuleSpec, RLModuleSpec
from ray.rllib.algorithms import AlgorithmConfig, DQNConfig
from ray.rllib.env.multi_agent_env_runner import MultiAgentEnvRunner
from ray.rllib.examples.multi_agent.multi_agent_cartpole import MultiAgentCartPole
import gymnasium as gym


class CustomMultiRLModule(MultiRLModule):
    def setup(self):
        super().setup()
        assert self.model_config is not None


spec = MultiRLModuleSpec(
    multi_rl_module_class=CustomMultiRLModule,
    rl_module_specs={
        'agent_1': RLModuleSpec(
            TorchRLModule,
            observation_space=gym.spaces.Box(0, 1),
            action_space=gym.spaces.Box(0, 1),
        )
    },
    model_config={'some_config': 1},
)

# This works fine.
module = spec.build()

# This doesn't, because of the missing argument in the `AlgorithmConfig` class.
algo_config = (
    DQNConfig()
    .environment(MultiAgentCartPole)
    .rl_module(rl_module_spec=spec)
    .multi_agent(
        policies={'agent_1'},
        policy_mapping_fn=lambda agent_id, episode, worker, **kwargs: agent_id,
    )
)
MultiAgentEnvRunner(config=algo_config)

Issue Severity
Medium: It is a significant difficulty but I can work around it.