Easy-to-use recipes across diverse domains, built with a variety of frameworks.

Convolutional autoencoder

A 2D convolutional autoencoder for house-number images.
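
The recipe presumably implements the autoencoder in a deep-learning framework; as a framework-agnostic sketch of the encode/decode shape symmetry, here is a NumPy toy (average pooling stands in for strided convolutions, nearest-neighbour repetition for upsampling; both helpers are illustrative, not from the recipe):

```python
import numpy as np

def encode(x):
    """Two rounds of 2x2 average pooling: 32x32 -> 8x8 bottleneck."""
    for _ in range(2):
        h, w = x.shape
        x = x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return x

def decode(z):
    """Two rounds of nearest-neighbour upsampling: 8x8 -> 32x32."""
    for _ in range(2):
        z = z.repeat(2, axis=0).repeat(2, axis=1)
    return z

image = np.arange(1024, dtype=float).reshape(32, 32)
code = encode(image)          # compressed representation
reconstruction = decode(code) # same shape as the input
```

A real convolutional autoencoder learns the down- and up-sampling filters instead of fixing them, but the shape bookkeeping is the same.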

2D Convolution

Adding convolutional layers to the MNIST classifier.
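
To show what a single 2D convolution computes, independent of any framework, here is a minimal NumPy sketch (the `conv2d` helper and the edge-detector kernel are illustrative assumptions, not taken from the recipe):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid cross-correlation of a 2-D image with a 2-D kernel."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Sum of the element-wise product over one kernel-sized patch.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
edge = np.array([[1.0, -1.0]])  # crude horizontal edge detector
response = conv2d(image, edge)
```

A convolutional layer learns many such kernels and slides each over the input, which is what makes it well suited to images like MNIST digits.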

Pooling

Adding pooling layers.
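
As a framework-agnostic illustration of what a 2x2 max-pooling layer does, here is a small NumPy sketch (the `max_pool_2x2` helper is an assumption for illustration):

```python
import numpy as np

def max_pool_2x2(x):
    """Downsample a 2-D array by taking the max of each 2x2 block."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

feature_map = np.arange(16, dtype=float).reshape(4, 4)
pooled = max_pool_2x2(feature_map)  # 4x4 -> 2x2
```

Pooling halves the spatial resolution while keeping the strongest activation in each region, which reduces computation and adds a little translation invariance.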

1D Convolution

A 1D CNN on text for classifying reviews.
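
A 1D convolution slides a kernel along the sequence axis of embedded tokens rather than over image pixels. A minimal NumPy sketch (the `conv1d` helper and the toy shapes are illustrative assumptions):

```python
import numpy as np

def conv1d(seq, kernel):
    """Valid 1-D convolution over a (time, embedding_dim) sequence
    with a (kernel_width, embedding_dim) filter."""
    k = kernel.shape[0]
    return np.array([
        np.sum(seq[t:t + k] * kernel)       # one window of k tokens
        for t in range(len(seq) - k + 1)
    ])

tokens = np.ones((5, 3))   # 5 tokens, 3-dim embeddings
bigram_filter = np.ones((2, 3))
features = conv1d(tokens, bigram_filter)
```

Each output is a score for one window of adjacent tokens, so a width-2 filter acts like a learned bigram detector for, e.g., sentiment phrases in reviews.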

Generalization through regularization

Making a model generalize better (less dependent on the training data) through regularization.

Experimenting with optimizers

On the effect the choice of optimizer has on accuracy.
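
The recipe presumably swaps framework optimizers (SGD, Adam, etc.) and compares accuracy; the core difference between two of them can be sketched on a toy quadratic objective f(w) = w² in pure Python (both helpers are illustrative assumptions):

```python
def sgd(w, lr, steps):
    """Plain gradient descent on f(w) = w**2 (gradient is 2*w)."""
    for _ in range(steps):
        w -= lr * 2 * w
    return w

def sgd_momentum(w, lr, beta, steps):
    """Same objective, with a velocity term that accumulates past gradients."""
    v = 0.0
    for _ in range(steps):
        v = beta * v + 2 * w
        w -= lr * v
    return w

plain = sgd(1.0, lr=0.1, steps=10)
momentum = sgd_momentum(1.0, lr=0.1, beta=0.9, steps=10)
```

Same objective, same learning rate, yet the two rules trace different trajectories; on real loss surfaces such differences translate into different convergence speed and final accuracy.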

Hidden layers and units

Experimenting with the number of hidden layers and units per layer.
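
The capacity cost of adding layers or widening them can be made concrete by counting trainable parameters of a fully connected network. A pure-Python sketch (the helper and the example sizes are illustrative assumptions):

```python
def param_count(layer_sizes):
    """Trainable parameters (weights + biases) of a dense network
    whose layer widths are given in order, input first."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# e.g. a 784-input MNIST-style net with one 128-unit hidden layer
# and a 10-way output:
n = param_count([784, 128, 10])
```

More layers and units mean more parameters, hence more expressive power but also more risk of overfitting, which is the trade-off this recipe explores.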
Regularization through dropout

Avoiding overfitting through dropout layers.
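
At training time, dropout randomly zeroes units so the network cannot rely on any single one. A minimal NumPy sketch of the common "inverted dropout" variant (the helper and the toy shapes are illustrative assumptions):

```python
import numpy as np

def dropout(x, p, rng):
    """Inverted dropout: zero each unit with probability p and scale
    survivors by 1/(1-p) so the expected activation is unchanged."""
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask

rng = np.random.default_rng(0)
activations = np.ones(1000)
dropped = dropout(activations, p=0.5, rng=rng)
```

Because of the 1/(1-p) scaling, no adjustment is needed at inference time: the layer is simply turned off and expectations still match.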

Using weights to account for imbalanced data

Applying a higher weight to the under-represented (imbalanced) classes.
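
A common recipe for such weights is inverse-frequency weighting, n_samples / (n_classes * count_per_class), so rare classes contribute more to the loss. A minimal NumPy sketch (the helper and the toy labels are illustrative assumptions):

```python
import numpy as np

def class_weights(labels):
    """Inverse-frequency class weights: rarer classes get larger weights."""
    counts = np.bincount(labels)
    n_classes = counts.size
    return len(labels) / (n_classes * counts)

# 3 negatives, 1 positive: the rare class is weighted 3x heavier.
labels = np.array([0, 0, 0, 1])
weights = class_weights(labels)
```

These per-class weights are then passed to the loss so that misclassifying a rare example costs proportionally more than misclassifying a common one.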