Weight Initialization

The aim of weight initialization is to prevent layer activation outputs from exploding or vanishing during the course of a forward pass through a deep neural network. If either occurs, loss gradients will be either too large or too small to flow backwards usefully, and the network will take longer to converge.

1. Xavier Initialization

Xavier (Glorot) initialization sets the weights in your network by drawing them from a distribution whose variance is scaled by the number of connections into and out of each layer, so that the variance of the activations stays roughly constant from layer to layer.
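As a minimal sketch, the common uniform variant of Xavier initialization draws weights from U(-a, a) with a = sqrt(6 / (fan_in + fan_out)); the function name and NumPy usage below are illustrative, not from a specific library:

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng=None):
    """Xavier/Glorot uniform initialization.

    Draws weights from U(-limit, limit) with
    limit = sqrt(6 / (fan_in + fan_out)), which keeps the
    variance of activations roughly constant across layers.
    """
    rng = np.random.default_rng() if rng is None else rng
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# Example: weight matrix for a fully connected 256 -> 128 layer
W = xavier_init(256, 128)
print(W.shape)  # (256, 128)
```

The resulting weights have variance 2 / (fan_in + fan_out), the value Glorot and Bengio derived to balance the forward and backward signal scales.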