Tensorflow (1.4.1) Tensorboard Visualization Plot Goes Back In Time?
I created a few summary ops throughout my graph like so:

tf.summary.scalar('cross_entropy', cross_entropy)
tf.summary.scalar('accuracy', accuracy)

and of course merged and got a w
Solution 1:
You can't (easily) reopen and "append" to an existing events file, but that's not necessary.
Tensorboard will display sequential event files just fine, as long as the step value in the records is consistent.
When you save a summary, you specify a step value, which indicates at which point on the x-axis the summary should be plotted.
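For example, here is a minimal TensorFlow 1.x sketch of writing summaries with an explicit step; the placeholder, the './logs' directory, and the toy accuracy value are made up for illustration. The second argument to add_summary is the step that decides where each point lands on the x-axis:

import tensorflow as tf

# Toy value just to have something to log.
value = tf.placeholder(tf.float32)
tf.summary.scalar('accuracy', value)
merged = tf.summary.merge_all()

with tf.Session() as sess:
    writer = tf.summary.FileWriter('./logs', sess.graph)
    for step in range(100):
        summary_str = sess.run(merged, feed_dict={value: step * 0.01})
        # The second argument is the step: TensorBoard plots this point at x = step.
        writer.add_summary(summary_str, global_step=step)
    writer.close()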
The graph goes "back in time" because at every new run you restart the step counter from 0. To keep it consistent across runs, you should define a global_step variable that is saved to the checkpoint when you save the network. This way, when you restore the network in the next training run, the global step picks up where it left off and your graphs will no longer look weird.
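A minimal sketch of that pattern, assuming TensorFlow 1.x; the tiny model, the loss, and the './ckpt' and './logs' paths are illustrative, not from the question. The point is that global_step lives in the graph, gets checkpointed by the Saver, and is therefore restored when training resumes:

import os
import tensorflow as tf

# Placeholder "model" and loss, standing in for the real graph.
x = tf.placeholder(tf.float32, shape=[None, 1])
w = tf.Variable(tf.zeros([1]))
loss = tf.reduce_mean(tf.square(x * w - 1.0))
tf.summary.scalar('loss', loss)

# global_step is an ordinary non-trainable variable, so the Saver
# checkpoints it together with the model weights.
global_step = tf.Variable(0, trainable=False, name='global_step')
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(
    loss, global_step=global_step)

merged = tf.summary.merge_all()
saver = tf.train.Saver()
os.makedirs('./ckpt', exist_ok=True)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    ckpt = tf.train.latest_checkpoint('./ckpt')
    if ckpt:
        saver.restore(sess, ckpt)      # restores global_step as well

    writer = tf.summary.FileWriter('./logs', sess.graph)
    for _ in range(100):
        _, summary_str, step = sess.run(
            [train_op, merged, global_step], feed_dict={x: [[1.0]]})
        # Use the restored, ever-increasing step so a resumed run continues
        # the curve instead of starting again from 0.
        writer.add_summary(summary_str, global_step=step)

    saver.save(sess, './ckpt/model', global_step=global_step)
    writer.close()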