I'm using a PyTorch Lightning LightningModule with a Trainer. I create the trainer with:

trainer = pl.Trainer(max_epochs=3)
Each training epoch has 511 steps (3 × 511 = 1533 in total) and each validation epoch has 127 steps.
I use self.global_step in the training_step function and log it with wandb:

wandb.log({"train_step": self.global_step})
1. As you can see, it seems that the train_step in wandb contains both the training steps and the validation steps. Why?
2. How can I view just the training values (with the training steps) in wandb?
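A minimal sketch of what is likely happening, under the assumption that wandb advances its internal x-axis step on every wandb.log call by default: Lightning also issues log calls during validation, so wandb's step counter outruns self.global_step, which only counts optimizer steps. The counter below is a hypothetical simulation of that behavior, not the real wandb API:

```python
# Hypothetical simulation of wandb's default step counter (assumption:
# every wandb.log call advances wandb's internal step, including calls
# made during validation). Numbers taken from the question above.

TRAIN_STEPS_PER_EPOCH = 511
VAL_STEPS_PER_EPOCH = 127
EPOCHS = 3

wandb_step = 0    # wandb's internal x-axis counter
global_step = 0   # Lightning's optimizer-step counter
logged = []       # (wandb_step, global_step) pairs for each train log

for epoch in range(EPOCHS):
    for _ in range(TRAIN_STEPS_PER_EPOCH):
        # stands in for: wandb.log({"train_step": self.global_step})
        logged.append((wandb_step, global_step))
        wandb_step += 1
        global_step += 1
    for _ in range(VAL_STEPS_PER_EPOCH):
        wandb_step += 1  # validation-time logs also advance wandb's step

print(global_step)   # 1533
print(wandb_step)    # 1533 + 3 * 127 = 1914
print(logged[-1])    # (1786, 1532): last train log lands at wandb step 1786
```

If this assumption holds, plotting "train_step" against wandb's default step axis shows the drift; a common remedy is to tell wandb to use "train_step" itself as the x-axis for the training metrics (wandb's define_metric with a step_metric), so the plot uses the training-step counter rather than the global log-call counter.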