Neural networks form the foundation of deep learning, the most advanced and widely used machine learning technique today. This course provides an introduction to neural networks. It begins with an overview of a neural network's basic concepts and building blocks (neurons, weights, activations, and layers) before explaining how to train one using gradient descent. The optimization technique is illustrated with a visual example, and practical issues such as parameter initialization and model validation are discussed. The course then covers the major neural network architectures, explains how they differ, and illustrates practical applications for each. Because training a neural network can be very slow, the course also offers tricks for speeding up the process and improving results. It closes with a review of the history of this fascinating field, from its origins through its decline to its modern resurgence. Requirements include a clear understanding of supervised learning and optimization.
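As a taste of the gradient descent idea the course builds on, here is a minimal sketch (not taken from the course materials) that minimizes a simple one-variable function by repeatedly stepping against its gradient; the function, learning rate, and step count are illustrative choices:

```python
# Minimal gradient descent sketch: minimize f(x) = (x - 3)^2,
# whose gradient is 2 * (x - 3). All values here are illustrative.

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Take `steps` steps of size `lr` against the gradient `grad`,
    starting from `x0`, and return the final point."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # move downhill along the negative gradient
    return x

x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # converges toward the minimum at x = 3
```

Training a neural network applies the same update rule, but to millions of weights at once, with gradients computed by backpropagation.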
Angie Ma, Gary Willis, and Alessandra Stagliano are data scientists at ASI Data Science, a London-based AI and machine learning solutions firm. Angie co-founded ASI and also founded Data Science Lab London, one of the largest communities of data scientists and data engineers in Europe, with over 2,500 members. Angie holds a PhD in physics from University College London, Gary holds a PhD in statistical physics from Imperial College London, and Alessandra holds a PhD in computer science from the University of Genoa. Together they have worked on more than 150 commercial AI and machine learning projects.