Dropout in Neural Networks
Dropout is a regularization technique where, while training a layer of your neural net, you randomly "drop out" about half of the layer's units. That is, on each training pass you drop each node with probability 1/2, meaning its output is set to zero and it contributes nothing to that pass, and keep it active with probability 1/2. This keeps the net from relying too heavily on any single node in the layer. I haven't read it in detail, but this is one of the earlier papers on dropout and it looks like it motivates the idea pretty well.
The key idea is to randomly drop units (along with their connections) from the neural network during training.
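If it helps to see the idea concretely, here is a minimal sketch of (inverted) dropout in NumPy. The function name, the 1/2 drop rate, and the rescaling of the surviving units are my own illustrative choices, not taken from the paper or any particular library.

```python
import numpy as np

def dropout_forward(x, p_drop=0.5, training=True, rng=None):
    """Illustrative inverted dropout: zero each unit with probability
    p_drop during training and rescale the survivors so the expected
    activation stays the same. (Sketch only; assumed names and rate.)"""
    if not training or p_drop == 0.0:
        return x
    rng = rng or np.random.default_rng()
    # Keep each unit with probability 1 - p_drop.
    mask = rng.random(x.shape) >= p_drop
    return x * mask / (1.0 - p_drop)

# Example: roughly half the activations are zeroed on each training pass.
activations = np.ones((4, 8))
print(dropout_forward(activations, p_drop=0.5))
```

At test time the mask is simply not applied (`training=False`), which is why the surviving units are rescaled during training in this variant.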