Epoch/Batch size/Iteration in machine learning

Papisetty Santhosh
1 min read · Jul 15, 2020

Many of us often get confused by these frequently used terms: epochs, batch size, and iterations. So let us dive in and understand them.

Epoch:-

One forward pass and one backward pass over the entire training data is called an epoch.

Suppose I set epochs=20; then my model gets trained 20 times on my entire training data set.
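To make this concrete, here is a minimal sketch in plain Python; the data set and the training step are stand-ins, not real training code:

```python
train_data = list(range(10))  # stand-in for a real training data set

epochs = 20
for epoch in range(epochs):
    # one full pass over all of train_data = 1 epoch
    for example in train_data:
        pass  # forward pass, loss, backward pass, weight update go here
```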

Batch size:-

The number of training examples in one forward and backward pass is called the batch size.

Suppose I have a training data set of 55,000 images, and I set the batch size to 1000; then 1000 images get trained at a time.

In total I get 55000/1000 = 55 batches.
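The arithmetic is just division; here is a small snippet using the numbers from the example above (math.ceil handles the general case where the data set size is not an exact multiple of the batch size, so the last batch comes out smaller):

```python
import math

dataset_size = 55000
batch_size = 1000

# 55000 divides evenly by 1000, so this gives exactly 55 batches
num_batches = math.ceil(dataset_size / batch_size)
print(num_batches)  # 55
```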

Iteration:-

Iteration means the number of times the parameters of the algorithm get updated.

In the above example we got 55 batches for one epoch. Our weights get updated after every batch, so we need 55 iterations to complete one epoch.

Summarising: if I have 55,000 training examples and the batch size is 1,000, then we need 55 iterations to complete 1 epoch.
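Putting the three terms together, a minimal sketch of the bookkeeping (the inner training step is a placeholder, not real model code):

```python
import math

dataset_size = 55000
batch_size = 1000
epochs = 20

iterations_per_epoch = math.ceil(dataset_size / batch_size)  # 55
total_iterations = iterations_per_epoch * epochs             # 1100

for epoch in range(epochs):
    for iteration in range(iterations_per_epoch):
        # each iteration: forward pass on one batch of 1000 images,
        # backward pass, and one weight update
        pass

print(iterations_per_epoch, total_iterations)  # 55 1100
```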
