Marcus Liwicki
[intermediate/advanced] Methods for Learning with Few Data
Summary
Deep neural networks are data hungry: they require millions of labelled examples in order to work! — Really? — The last decade has produced useful approaches for working with less labelled data, either by exploiting abundant data from a similar domain or by letting the network learn meaningful representations without explicit supervision. This tutorial first places self-supervised learning in the general perspective of learning with few data, covering typical transfer learning and auto-encoder approaches as well as perceptual losses. Furthermore, the tutorial will examine some typical (mis)conceptions about these methods and offer practical tips on how to learn with few data. By participating in this tutorial, you will gain deep insights into representation learning and learning with few data, as well as practical tools to start working on data in your own domain.
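As a concrete toy illustration of the transfer-learning idea mentioned above, the sketch below freezes a stand-in "pretrained" feature extractor and trains only a small new task head on a handful of labelled examples. The fixed random projection, the toy data, and all variable names are illustrative assumptions, not material from the tutorial itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained backbone: a FROZEN feature extractor.
# (In practice this would be e.g. a CNN trained on a large source
# domain; here a fixed random projection plays that role.)
W_backbone = rng.normal(size=(64, 16))

def features(x):
    """Frozen feature extractor: W_backbone is never updated."""
    return np.tanh(x @ W_backbone)

# Few labelled target-domain examples (hypothetical toy data).
X = rng.normal(size=(20, 64))
y = (X[:, 0] > 0).astype(float)          # binary toy labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Only the new task head is trained.
w_head = np.zeros(16)
b_head = 0.0
lr = 0.5
losses = []
for step in range(200):
    F = features(X)                      # frozen representations
    p = sigmoid(F @ w_head + b_head)     # head predictions
    losses.append(-np.mean(y * np.log(p + 1e-9)
                           + (1 - y) * np.log(1 - p + 1e-9)))
    grad = p - y                         # d(cross-entropy)/d(logits)
    w_head -= lr * F.T @ grad / len(X)   # update head only
    b_head -= lr * grad.mean()

acc = ((sigmoid(features(X) @ w_head + b_head) > 0.5) == y).mean()
print(f"final loss {losses[-1]:.3f}, training accuracy {acc:.2f}")
```

The point of the sketch is the division of labour: all learning happens in `w_head`/`b_head`, while the representation is reused as-is, which is why only a few labelled examples are needed.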
Syllabus
- Introduction
- Motivation
- Examples
- Background
- Problem formulation
- Learning with few data
- Priors
- Approaches
- End-to-end learning
- Transfer learning
- Clustering
- Representation learning
- Auto-encoding
- Contrastive learning
- Comparative summary
- Remarks on contrastive learning
* and surprises in-between
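To make the contrastive-learning item in the syllabus concrete, here is a minimal NumPy sketch of the NT-Xent loss used in SimCLR (listed under References): two augmented views of the same example attract, all other pairs in the batch repel. The toy embeddings, temperature value, and function name are illustrative assumptions.

```python
import numpy as np

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) loss.

    z1[i] and z2[i] are embeddings of two augmented views of the
    same example; every other row in the batch acts as a negative.
    """
    z = np.concatenate([z1, z2], axis=0)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalize
    n = len(z1)
    sim = z @ z.T / tau                               # scaled cosine sims
    np.fill_diagonal(sim, -np.inf)                    # exclude self-pairs
    # The positive for row i is its other view: i + n (or i - n).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logprob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -logprob[np.arange(2 * n), pos].mean()

rng = np.random.default_rng(0)
z_a = rng.normal(size=(8, 32))                        # toy "view 1" batch
aligned = nt_xent(z_a, z_a + 0.01 * rng.normal(size=(8, 32)))
random_ = nt_xent(z_a, rng.normal(size=(8, 32)))
print(f"aligned views: {aligned:.3f}, unrelated views: {random_:.3f}")
```

The loss is much lower when the two views agree than when they are unrelated, which is exactly the signal a contrastive learner optimizes without any labels.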
References
A Survey on Deep Transfer Learning – 2018
Deep Clustering for Unsupervised Learning of Visual Features – 2018
Variational Autoencoder for Deep Learning of Images, Labels and Captions – 2016
A Pitfall of Unsupervised Pre-Training – 2017
SimCLR: A Simple Framework for Contrastive Learning of Visual Representations – 2020
SwAV: Unsupervised Learning of Visual Features by Contrasting Cluster Assignments – 2020
Further references will be given during the tutorial and in the slides.
Pre-requisites
Foundations of machine learning and neural networks (including backpropagation). Linear algebra. Basics of deep learning (CNN, ResNet, DenseNet, auto-encoders). Other data processing algorithms (PCA, LDA).
Short bio
Marcus Liwicki received his M.S. degree in Computer Science from the Free University of Berlin, Germany, in 2004, his PhD degree from the University of Bern, Switzerland, in 2007, and his habilitation degree from the Technical University of Kaiserslautern, Germany, in 2011. Currently he is Chaired Professor of Machine Learning and Vice-Rector for AI at Luleå University of Technology. His research interests include machine learning, pattern recognition, artificial intelligence, human-computer interaction, digital humanities, knowledge management, ubiquitous intuitive input devices, document analysis, and graph matching. From October 2009 to March 2010 he visited Kyushu University (Fukuoka, Japan) as a research fellow (visiting professor), supported by the Japanese Society for the Promotion of Science. In 2015, at the age of 32, he received the ICDAR Young Investigator Award, a biennial award acknowledging outstanding achievements in pattern recognition by researchers up to the age of 40.