
[Lecture Series] Mini-batch stochastic gradient descent with dynamic sample and step sizes

By Dr. Michael Metel, May 16, 2017, 4:30-5:30 pm

The Advanced Optimization Laboratory will host a talk by Dr. Michael Metel from the Laboratoire de Recherche en Informatique, Université Paris-Sud, on May 16, 2017. In this seminar, Dr. Metel will present stochastic gradient descent and its variants, with a particular focus on the mini-batch implementation. He will cover basic convergence results and give an overview of current research on dynamically chosen sample sizes. New sample and step size rules will be presented, along with a preliminary study examining the usefulness of the mini-batch methodology in an industrial application. For more information, please visit the AdvOL Optimization Seminar page. Refreshments will be provided. This invited seminar will be preceded by a student seminar presentation by George Manoussakis entitled "Listing all fixed length simple cycles and paths in sparse graphs in optimal time".
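To give a flavour of the topic, the following is a minimal sketch of mini-batch SGD with dynamic sample and step sizes. The specific rules used here (a geometrically growing batch size and a 1/(k+1) diminishing step size) are common illustrative choices, not the rules from Dr. Metel's talk; the objective and data are likewise made up for the example.

```python
import random

def minibatch_sgd(data, w0=0.0, epochs=30, seed=0):
    """Minimize f(w) = mean((w - x)^2) over the data with mini-batch SGD.

    Dynamic rules (illustrative only): the sample (batch) size grows
    geometrically each iteration, and the step size decays as 1/(k+1).
    """
    rng = random.Random(seed)
    w = w0
    batch_size = 1
    for k in range(epochs):
        step = 1.0 / (k + 1)  # diminishing step size
        batch = rng.sample(data, min(batch_size, len(data)))
        # Stochastic gradient: average of per-sample gradients 2*(w - x)
        grad = sum(2.0 * (w - x) for x in batch) / len(batch)
        w -= step * grad
        # Growing sample size reduces gradient noise as iterates converge
        batch_size = min(2 * batch_size, len(data))
    return w

data = [1.0, 2.0, 3.0, 4.0, 5.0]  # minimizer of f is the mean, 3.0
w_star = minibatch_sgd(data)
```

The intuition behind dynamic sampling is that cheap, noisy gradients from small batches suffice early on, while larger batches late in the run cut gradient variance exactly when accuracy matters.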




Title: Mini-batch stochastic gradient descent with dynamic sample and step sizes
Presenter: Dr. Michael Metel
Date: Tuesday, May 16, 2017
Time: 4:30 pm to 5:30 pm
Location: ITB 201, McMaster University