I'm using a PyTorch Lightning LightningModule with the Lightning Trainer. I create the trainer with:

    trainer = pl.Trainer(max_epochs=3)

Each training epoch has 511 steps (511 × 3 = 1533 in total) and each validation epoch has 127 steps.

I log self.global_step from the training_step function to wandb:

    wandb.log({"train_step": self.global_step})
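For context, wandb maintains its own monotonically increasing "Step" counter on the x-axis: every wandb.log call made without an explicit step argument advances it by one, whether the call came from training or validation. A minimal stand-in simulation (FakeRun is a hypothetical class, not part of wandb) of why the counters drift apart:

```python
# Stand-in for wandb's internal step counter: every log() call without an
# explicit step advances it by one, regardless of what the payload contains.
class FakeRun:
    def __init__(self):
        self.step = 0          # wandb's x-axis "Step"
        self.history = []

    def log(self, data):
        self.history.append((self.step, data))
        self.step += 1

run = FakeRun()
global_step = 0
train_steps, val_steps = 5, 3   # toy sizes; the question has 511 and 127

for _ in range(train_steps):
    run.log({"train_step": global_step})   # logging inside training_step
    global_step += 1                       # Lightning bumps this per optimizer step

for _ in range(val_steps):
    run.log({"val_loss": 0.0})             # validation logging ALSO bumps wandb's step

# After one epoch wandb's step counts training + validation log calls,
# while Lightning's global_step only counted optimizer steps.
print(run.step, global_step)   # 8 5
```

This mirrors the plot in the question: the wandb x-axis runs ahead of global_step by the number of validation log calls.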

[wandb plot screenshot omitted]

1. As you can see, the train_step axis in wandb appears to include both the training steps and the validation steps. Why?

2. How can I view just the training values (plotted against the training steps) in wandb?
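One common approach is to pin each log call to a chosen x value: wandb.log does accept a step= keyword, and wandb.define_metric can declare a custom x-axis metric. A toy simulation of the explicit-step idea (FakeRun below is a hypothetical stand-in, not the real wandb client, which also drops or merges out-of-order steps):

```python
class FakeRun:
    """Stand-in for a wandb run: log(data, step=...) pins the x value."""
    def __init__(self):
        self._auto = 0
        self.history = []

    def log(self, data, step=None):
        # With an explicit step, the point lands at that x; otherwise the
        # internal counter is used, as in the unpinned case.
        x = step if step is not None else self._auto
        self.history.append((x, data))
        self._auto = x + 1

run = FakeRun()
for gs in range(5):                        # training: pin x to global_step
    run.log({"train/loss": 0.1}, step=gs)

run.log({"val/loss": 0.2}, step=4)         # validation reuses the current global_step

xs = [x for x, d in run.history if "train/loss" in d]
print(xs)   # [0, 1, 2, 3, 4] -- training points sit exactly at global_step
```

With real wandb you can then select train_step (or any pinned metric) as the x-axis in the chart settings, so validation log calls no longer stretch the training curve.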

Tags: torch · Why the global_step (training step) is not in sync with the wandb plot steps · Stack Overflow