Comparison between Steps and Epochs in TensorFlow
- Python for Integrated Circuits -
- An Online Book -



=================================================================================

The comparison between steps and epochs is as follows:
         i) epoch:
           i.a) An epoch is one complete pass over the entire training set: every training example is used once for gradient calculation and parameter updates (i.e. to train the model). Because the data are processed in batches, one epoch usually consists of many steps.
           i.b) train_and_evaluate (from the tf.estimator API) can be used to report accuracy on the test set after every epoch; a sketch is given after this list.
         ii) step:
           ii.a) A step (or iteration) corresponds to one forward pass and one backward pass over a single batch, i.e. one gradient update; a training step therefore uses one batch of training data to update the model. The total number of steps equals the number of epochs multiplied by the number of training examples divided by the batch size.
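The snippet below is a rough sketch of the train_and_evaluate idea mentioned in i.b. The toy data, the canned DNNClassifier, and the evaluation timing (controlled here by start_delay_secs/throttle_secs rather than strictly once per epoch) are illustrative assumptions, and it requires a TensorFlow version that still ships the tf.estimator API:

# Hedged sketch: train for EPOCHS * STEPS_PER_EPOCH steps while periodically
# evaluating; DNNClassifier's evaluation metrics include accuracy, which is logged.
import numpy as np
import tensorflow as tf

NUM_EXAMPLES, BATCH_SIZE, EPOCHS = 1000, 100, 5
STEPS_PER_EPOCH = NUM_EXAMPLES // BATCH_SIZE        # 10 steps per epoch

# Toy random data (illustrative only); in practice use a real train/test split.
features = {"x": np.random.rand(NUM_EXAMPLES, 4).astype(np.float32)}
labels = np.random.randint(0, 2, NUM_EXAMPLES)

def train_input_fn():
    ds = tf.data.Dataset.from_tensor_slices((features, labels))
    return ds.shuffle(NUM_EXAMPLES).repeat().batch(BATCH_SIZE)

def eval_input_fn():
    return tf.data.Dataset.from_tensor_slices((features, labels)).batch(BATCH_SIZE)

estimator = tf.estimator.DNNClassifier(
    hidden_units=[16, 8],
    feature_columns=[tf.feature_column.numeric_column("x", shape=(4,))])

train_spec = tf.estimator.TrainSpec(input_fn=train_input_fn,
                                    max_steps=EPOCHS * STEPS_PER_EPOCH)
eval_spec = tf.estimator.EvalSpec(input_fn=eval_input_fn, steps=None,
                                  start_delay_secs=0, throttle_secs=1)

tf.estimator.train_and_evaluate(estimator, train_spec, eval_spec)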

A training set is usually divided into N mini-batches. Each step then processes one batch, so N steps are needed to complete a full epoch. Most models expose a steps parameter that specifies how many steps to run over the data.
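As a concrete illustration of this bookkeeping, the short snippet below works through the arithmetic; the dataset size, batch size, and epoch count are illustrative assumptions, not values from any particular model:

# Worked example of the step/epoch relationship described above.
num_examples = 60000      # total training examples in the dataset (assumed)
batch_size = 100          # examples processed per step, i.e. per gradient update
num_epochs = 5            # full passes over the training set

steps_per_epoch = num_examples // batch_size     # N = 600 steps per epoch
total_steps = num_epochs * steps_per_epoch       # 5 * 600 = 3000 steps in total

print(steps_per_epoch, total_steps)              # -> 600 3000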

When training a Keras model, the .fit() method is where the number of epochs is specified; together with the batch size (or steps_per_epoch), this determines how many steps are run.
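A minimal Keras sketch of this is shown below; the toy data and the tiny model architecture are placeholder assumptions. With validation_data supplied, Keras also reports validation accuracy at the end of each epoch:

import numpy as np
import tensorflow as tf

# Toy data: 1000 training and 200 test examples with 20 features each (illustrative only).
x_train = np.random.rand(1000, 20).astype(np.float32)
y_train = np.random.randint(0, 2, 1000)
x_test = np.random.rand(200, 20).astype(np.float32)
y_test = np.random.randint(0, 2, 200)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# 5 epochs with batch size 100 -> 1000 / 100 = 10 steps (gradient updates) per epoch.
history = model.fit(x_train, y_train,
                    epochs=5,
                    batch_size=100,
                    validation_data=(x_test, y_test))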

============================================

=================================================================================