Vincent Szolnoky: Model Gradient Similarity
At the RISE Learning Machines Seminar on 2 Feb 2023, we have the pleasure of hearing Vincent Szolnoky, Chalmers, give his talk: On the Interpretability of Regularisation for Neural Networks Through Model Gradient Similarity. The seminar will be held in English.
– Artificial neural networks often require the aid of explicit regularisers. In this talk, a new framework named Model Gradient Similarity (MGS) will be introduced.
Abstract
Most complex machine learning and modelling techniques are prone to over-fitting and may subsequently generalise poorly to future data. Artificial neural networks are no different in this regard and, despite having a level of implicit regularisation when trained with gradient descent, often require the aid of explicit regularisers. In this talk, a new framework named Model Gradient Similarity (MGS) will be introduced that (1) serves as a metric of regularisation, which can be used to monitor neural network training, (2) adds insight into how explicit regularisers, while derived from widely different principles, operate via the same underlying mechanism of increasing MGS, and (3) provides the basis for a new regularisation scheme which exhibits excellent performance, especially in challenging settings such as high levels of label noise or limited sample sizes.
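To give a flavour of the idea, the following is a minimal, hypothetical sketch of a gradient-similarity metric: it computes per-sample loss gradients for a toy linear model and averages their pairwise cosine similarities. The paper's exact MGS definition is not reproduced here; the model, loss, and similarity measure below are illustrative assumptions only.

```python
import numpy as np

# Illustrative toy setup (NOT the paper's definition): a linear model
# f(x) = w·x with squared loss, so each sample's loss gradient is
# g_i = 2 * (w·x_i - y_i) * x_i.
rng = np.random.default_rng(0)
n, d = 20, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)
w = rng.normal(size=d)  # current (untrained) model parameters

def per_sample_gradients(w, X, y):
    """One loss gradient per training sample, shape (n, d)."""
    residuals = X @ w - y                  # shape (n,)
    return 2.0 * residuals[:, None] * X    # broadcast residual over features

def mean_pairwise_cosine(G):
    """Average off-diagonal cosine similarity between rows of G."""
    norms = np.linalg.norm(G, axis=1, keepdims=True)
    U = G / np.clip(norms, 1e-12, None)    # unit-normalised gradients
    S = U @ U.T                            # all pairwise cosine similarities
    m = len(G)
    return (S.sum() - np.trace(S)) / (m * (m - 1))

G = per_sample_gradients(w, X, y)
print(f"mean pairwise gradient similarity: {mean_pairwise_cosine(G):.3f}")
```

The intuition, as the abstract suggests, is that such a similarity score can be monitored during training: when per-sample gradients point in similar directions, the network is learning shared structure rather than memorising individual (possibly noisy) labels.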
About the speaker
Vincent Szolnoky is an industrial PhD student at Chalmers University of Technology and Centiro. His research is centered around training and regularisation of Neural Networks. Most recently, Vincent’s paper on Model Gradient Similarity and its connection to regularisation for Neural Networks was accepted to NeurIPS 2022, one of the largest machine learning conferences, where it was well received and garnered much attention.