An epoch is one complete pass of the entire training dataset through a learning algorithm in machine learning.
What Is an Epoch?
In the world of neural networks, an epoch is one complete pass through the entire training dataset. It usually takes many epochs to train a neural network. In other words, if a neural network is trained on data containing varied patterns over more than one epoch, you can expect it to generalize better when it is given fresh, unobserved input data (test data).
With each epoch, the model's parameters are updated based on the training data. Training is often organized as a for-loop over a fixed number of epochs, where each iteration passes through the entire training dataset. Within an epoch, the data is processed in batches: the batch size is an integer of 1 or greater that specifies how many samples are seen before the parameters are updated. When a single batch spans the whole dataset, the procedure is called batch gradient descent.

If the batch size is set to one, the loop contains an inner pass in which each individual sample forms its own batch, so the parameters are updated after every sample. How many epochs a model needs in order to learn depends on several factors related to both the data and the purpose of the model; translating this procedure into a concrete training schedule usually requires a good understanding of the data.
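The epoch/batch loop described above can be sketched in plain Python. This is a minimal illustration, not a production recipe: the dataset, learning rate, and the simple one-parameter linear model are all invented for the example.

```python
# Minimal sketch of epoch-based training: gradient descent fitting y = 2*x.
# Each epoch is one full pass over the data; within an epoch, the
# parameters are updated once per batch.

data = [(x, 2 * x) for x in range(1, 9)]  # (input, target) pairs

def train(data, epochs, batch_size, lr=0.01):
    w = 0.0  # single model parameter
    for epoch in range(epochs):                    # outer loop: epochs
        for i in range(0, len(data), batch_size):  # inner loop: batches
            batch = data[i:i + batch_size]
            # average gradient of the squared error over this batch
            grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
            w -= lr * grad                         # one update per batch
    return w

print(round(train(data, epochs=200, batch_size=4), 3))  # converges toward 2.0
```

Setting `batch_size=1` here would update the parameter after every single sample, while `batch_size=len(data)` would give one update per epoch (batch gradient descent).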
The term also has a meaning in blockchain technology, where an epoch is a period of time used to determine what events happen on the network, such as when incentives are distributed or when a new group of validators is assigned to check transactions. Each blockchain protocol defines this period differently. Typically, it is the time it takes to produce a certain number of blocks in the chain.
For example, Cardano (ADA) is a blockchain system that uses the epoch as a unit of time. Its Ouroboros Praos Proof-of-Stake (PoS) consensus protocol divides the blockchain's timeline into five-day epochs. Each epoch is in turn divided into one-second slots, so an epoch currently contains 432,000 slots (five days).
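The slot arithmetic above is simple enough to sketch directly. This assumes Cardano-style epochs of 432,000 one-second slots; the function name is illustrative and not part of any Cardano API.

```python
# Sketch of epoch/slot arithmetic for a Cardano-style chain:
# 432,000 one-second slots per epoch = 5 days (5 * 86,400 seconds).

SLOTS_PER_EPOCH = 432_000

def epoch_of_slot(absolute_slot):
    """Return (epoch number, slot index within that epoch)."""
    return divmod(absolute_slot, SLOTS_PER_EPOCH)

print(epoch_of_slot(432_000))    # (1, 0): the first slot of epoch 1
print(epoch_of_slot(1_000_000))  # (2, 136000): partway through epoch 2
```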