Neural network basics

Abstract

Neural networks have been all the rage in recent times. They have revolutionized language translation, image recognition, game playing, self-driving cars, and much else besides. This event will help you learn the basics of a feedforward neural network. We will use the Keras library with Google's TensorFlow backend to teach this class. You can play around with the running model and visualize it using TensorBoard during the class (and offline later). Among the many topics, you will learn:

a) What are the components of a neural network? A neuron, output unit, hidden unit, cost function, activation function.
b) Why are there multiple layers?
c) How powerful can neural networks be?
d) How are neural networks trained?
e) The theory behind batch and stochastic minibatch gradient descent.
f) Optimization algorithms, including stochastic gradient descent (SGD), momentum-based methods such as Nesterov momentum, and adaptive methods such as Adagrad, RMSprop, and Adam.
g) Parameter initialization.
h) Challenges encountered in practice.
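To give a flavor of topic (a), here is a minimal sketch of the pieces of a feedforward network in plain Python: a single neuron (weighted sum plus bias), a sigmoid activation, one hidden layer of two units, an output unit, and a squared-error cost. All the weights here are made-up illustrative values, not anything you would actually train with.

```python
import math

def sigmoid(z):
    # Activation function: squashes any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    # A single neuron: weighted sum of inputs plus a bias, passed
    # through the activation function
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

def forward(x):
    # One hidden layer with two hidden units, then one output unit.
    # The weights below are arbitrary placeholders for illustration.
    h1 = neuron(x, [0.5, -0.3], 0.1)
    h2 = neuron(x, [-0.2, 0.8], 0.0)
    return neuron([h1, h2], [1.0, -1.0], 0.2)

def cost(y_true, y_pred):
    # Cost function: squared error for a single example
    return (y_true - y_pred) ** 2

y = forward([1.0, 2.0])
print(0.0 < y < 1.0)  # sigmoid output always lies in (0, 1)
```

Training would consist of adjusting the weights and biases to reduce the cost over a dataset, which is where gradient descent comes in.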
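For topics (d) and (e), the idea of minibatch stochastic gradient descent can be sketched on a toy one-parameter problem (the data, learning rate, and batch size below are all invented for illustration):

```python
import random

# Toy dataset: targets follow y = 3x exactly, so the learned
# weight should converge toward 3.0
data = [(x, 3.0 * x) for x in range(1, 11)]

w = 0.0           # the single parameter we are learning
lr = 0.01         # learning rate (step size)
batch_size = 2    # size of each minibatch

random.seed(0)
for epoch in range(200):
    # Shuffle each epoch so minibatches differ between passes
    random.shuffle(data)
    for i in range(0, len(data), batch_size):
        batch = data[i:i + batch_size]
        # Gradient of the squared error (w*x - y)^2 with respect
        # to w, averaged over the minibatch
        grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
        # Take a small step against the gradient
        w -= lr * grad

print(round(w, 3))  # close to 3.0
```

Batch gradient descent would compute the gradient over the whole dataset each step; the minibatch variant trades a noisier gradient estimate for much cheaper updates, which is the trade-off discussed in the class.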