How to get a progress bar in PyTorch

Aladdin Persson
1 min read · Jan 29, 2021

Let us start with the basic imports; we will be using tqdm for our progress bar:
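A minimal version of that imports cell might look like this (the exact imports are an assumption, since the original code cell is not shown here):

import torch
import torch.nn as nn
from torch.utils.data import TensorDataset, DataLoader
from tqdm import tqdm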

Let’s create a simple toy dataset using the TensorDataset we imported above. Since this is just a simple example, our dataset will consist of randomly generated numbers. Replace the cell below with code that loads a dataset of your choice (it doesn’t matter which).
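Something along these lines would do (the tensor shapes and the batch size of 8 are assumptions; with 1,000 samples and a batch size of 8 you get the 125 iterations per epoch shown in the output further down):

# Fake "images" and targets -- purely illustrative random data.
x = torch.randn(1000, 3, 28, 28)
y = torch.randn(1000, 10)
dataset = TensorDataset(x, y)
loader = DataLoader(dataset, batch_size=8, shuffle=True)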

Let’s create a very simple model and training loop:
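A rough sketch of such a cell is below (the architecture, loss, optimizer, and learning rate are placeholder assumptions; the tqdm usage is the part that matters):

# Tiny placeholder model -- the progress bar doesn't care what it is.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 28 * 28, 10))
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)

num_epochs = 3
for epoch in range(num_epochs):
    # Wrapping the DataLoader in tqdm is the only change to a normal loop.
    loop = tqdm(loader)
    for data, targets in loop:
        scores = model(data)
        loss = criterion(scores, targets)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        # Label the bar and attach extra info; as noted below, loss and acc
        # are just random placeholder values in this toy example.
        loop.set_description(f"Epoch [{epoch}/{num_epochs}]")
        loop.set_postfix(loss=torch.rand(1).item(), acc=torch.rand(1).item())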

Here we set loss and acc to random values, but this is where you would set the important information you previously computed. This is what it will look like after it’s finished:

Epoch [0/3]: 100%|██████████████████████████████████| 125/125 [00:02<00:00, 42.25it/s, acc=0.776, loss=0.0617]
Epoch [1/3]: 100%|██████████████████████████████████| 125/125 [00:02<00:00, 41.70it/s, acc=0.0216, loss=0.668]
Epoch [2/3]: 100%|██████████████████████████████████| 125/125 [00:03<00:00, 41.23it/s, acc=0.0701, loss=0.912]

Alright, so it basically looks identical to how we normally set up our loops in PyTorch. The only difference is that we instead set loop = tqdm(loader), and then we can also add additional information to the progress bar, like the current (running) accuracy as well as the loss for the current batch. Personally, I always like to use a progress bar to know how long things will take, and I recommend you do it too! :)
