Data preparation

Without dropout

The gap here shows that the network has learned the features of the training set very well but fails to generalize to the validation data. This is what overfitting means.

Adding dropout

There is no longer a gap between the training and validation curves. The network has not memorized the training data; it has retained only its essential features.
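The mechanism behind this can be sketched in plain NumPy. This is a minimal illustration of inverted dropout, not the code used above; the function name, probability, and array shapes are assumptions for the example. During training, each unit is zeroed with probability `p` and the survivors are rescaled by `1/(1-p)` so the expected activation stays the same; at inference time the input passes through unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p, training=True):
    """Inverted dropout (illustrative sketch).

    During training, zero each unit with probability p and rescale
    the survivors by 1/(1-p) so the expected activation is unchanged.
    At inference time, return x untouched.
    """
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

activations = np.ones((4, 8))
dropped = dropout(activations, p=0.5)
# Roughly half the units are zeroed; the survivors are scaled up to 2.0,
# so the layer's expected output matches the no-dropout case.
print(dropped)
```

Because different units are dropped on every forward pass, the network cannot rely on any single co-adapted feature, which is why the training and validation curves stay close together.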