@ceteri
Last active July 7, 2020 06:38
Revisions

  1. ceteri renamed this gist Jul 7, 2020. 1 changed file with 0 additions and 0 deletions.
    File renamed without changes.
  2. ceteri revised this gist Jul 7, 2020. 1 changed file with 4 additions and 4 deletions.
    8 changes: 4 additions & 4 deletions r22.py
    @@ -3,9 +3,9 @@
     config = ppo.DEFAULT_CONFIG.copy()
     config["log_level"] = "WARN"

    -config["num_workers"] = 4 # default = 2
    -config["train_batch_size"] = 10000 # default = 4000
    -config["sgd_minibatch_size"] = 256 # default = 128
    -config["evaluation_num_episodes"] = 50 # default = 10
    +config["num_workers"] = 4 # default = 2
    +config["train_batch_size"] = 10000 # default = 4000
    +config["sgd_minibatch_size"] = 256 # default = 128
    +config["evaluation_num_episodes"] = 50 # default = 10

     agent = ppo.PPOTrainer(config, env=SELECT_ENV)
  3. ceteri created this gist Jul 7, 2020.
    11 changes: 11 additions & 0 deletions r22.py
    @@ -0,0 +1,11 @@
    +SELECT_ENV = "MountainCar-v0"
    +
    +config = ppo.DEFAULT_CONFIG.copy()
    +config["log_level"] = "WARN"
    +
    +config["num_workers"] = 4 # default = 2
    +config["train_batch_size"] = 10000 # default = 4000
    +config["sgd_minibatch_size"] = 256 # default = 128
    +config["evaluation_num_episodes"] = 50 # default = 10
    +
    +agent = ppo.PPOTrainer(config, env=SELECT_ENV)
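
As written, the gist assumes `ppo` is already imported. A minimal sketch of how the snippet fits into a complete script, assuming the Ray/RLlib API current when this gist was written (July 2020), where `DEFAULT_CONFIG` and `PPOTrainer` live in `ray.rllib.agents.ppo`; the iteration count and the reward-reporting loop are illustrative additions, not part of the gist:

```python
import ray
from ray.rllib.agents import ppo  # provides DEFAULT_CONFIG and PPOTrainer

SELECT_ENV = "MountainCar-v0"

ray.init(ignore_reinit_error=True)

# Copy RLlib's PPO defaults, then override a few knobs.
config = ppo.DEFAULT_CONFIG.copy()
config["log_level"] = "WARN"
config["num_workers"] = 4               # default = 2
config["train_batch_size"] = 10000      # default = 4000
config["sgd_minibatch_size"] = 256      # default = 128
config["evaluation_num_episodes"] = 50  # default = 10

agent = ppo.PPOTrainer(config, env=SELECT_ENV)

# Illustrative training loop: run a few iterations and report progress.
for i in range(5):
    result = agent.train()
    print(i, result["episode_reward_mean"])

ray.shutdown()
```

Note that `DEFAULT_CONFIG.copy()` leaves the library's shared default dict untouched, so the overrides apply only to this trainer instance.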