Neural Network Optimization

Neural network training is one of the most difficult optimization problems in practice: solving even a single instance can require substantial processing power (hundreds of machines) and months of time. To address this expensive and important problem, specialized optimization techniques have been developed. Want to learn how to optimize your neural networks to get better results? Our event will cover the following topics:

a) Theory behind batch and stochastic mini-batch gradient descent
b) Optimization algorithms, including Stochastic Gradient Descent (SGD), momentum-based methods such as Nesterov momentum, and adaptive methods such as Adagrad, RMSprop, and Adam
c) Parameter initialization
d) Common challenges encountered during training
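To give a taste of topic (b), here is a minimal sketch of SGD with momentum on a toy quadratic objective. The function name, learning rate, and momentum coefficient are illustrative choices, not part of any particular library:

```python
import numpy as np

def sgd_momentum_step(params, grads, velocity, lr=0.05, beta=0.9):
    """One SGD-with-momentum update: v <- beta*v - lr*grad; p <- p + v."""
    velocity = beta * velocity - lr * grads
    return params + velocity, velocity

# Toy example: minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3).
w = np.array([0.0])
v = np.zeros_like(w)
for _ in range(200):
    grad = 2.0 * (w - 3.0)
    w, v = sgd_momentum_step(w, grad, v)
print(w)  # converges near 3.0
```

In full-batch gradient descent the gradient is computed over the entire dataset; stochastic mini-batch variants replace `grad` with an estimate from a small random subset of examples, trading gradient noise for much cheaper iterations.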