Dropout Decoded: Comprehensive Classification and Analysis of Neural Network Regularization Methods
Amir Mohammad Sharafaddini, Najme Mansouri
Artificial Intelligence Review, 2025
Abstract
Neural networks are powerful tools for pattern recognition but are often prone to overfitting, limiting their real-world effectiveness. This paper surveys dropout and its numerous variants: regularization techniques that randomly deactivate units during training so that neural networks generalize more effectively to unseen data. A new classification framework is proposed that organizes 50 dropout methods based on their performance across three data categories: sequential, image-based, and general datasets. This structure provides insight into the optimal application of each method. Additionally, the paper examines dropout's role in improving performance through increased sparsity, enhanced inter-layer interactions, and reduced neuron dependency. The findings establish a foundation for further exploration of regularization strategies and assist researchers and practitioners in developing more robust and reliable neural network architectures.
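As background for the methods surveyed, standard (inverted) dropout can be sketched as follows. This is a minimal NumPy illustration of the general technique, not code from the paper; the function name and parameters are illustrative.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Zero each activation with probability p during training,
    rescaling survivors by 1/(1-p) so the expected value is unchanged."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)
```

Because the rescaling keeps the expected activation constant, the network can be used at test time with no change (`training=False` simply returns the input), which is the inverted-dropout convention used by most modern frameworks.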