Courses


Current and past courses of the Programming Systems Department at the Institute for Computer Science at the Friedrich-Alexander University (FAU) in Erlangen-Nuremberg, Germany.

Enrollment starts in the summer term and ends on 30 September each year! Tip: be quick!
Register by e-mail before the start of the seminar!
Topics are assigned on a first-come, first-served basis!


2021 – 2022


Machine Learning – Introduction (WS2020/21)
Bachelor (SemML-I) [5 ECTS]
Feigl T., Löffler C.
https://www.studon.fau.de/crs3111212.html
https://univis.fau.de

Content

This seminar introduces the topic of machine learning (ML). ML is the science of getting computers to act without being explicitly programmed. ML is so pervasive today that we probably use it every day without knowing it. For example, in recent years ML has made self-driving cars, practical image and speech recognition, and effective partner and web search possible.

The aim of the seminar is a comprehensive introduction to machine learning, the analysis and processing of data, and statistical pattern recognition. Topics include: (1) classification and regression problems; (2) supervised learning (parametric and non-parametric algorithms, linear and logistic regression, k-nearest neighbors, support vector machines, decision trees, shallow neural networks); (3) unsupervised learning (k-means clustering, dimensionality reduction, PCA, LDA, recommendation systems); (4) ensemble and online learning; (5) regularization: model diagnosis, error analysis, quality metrics, and interpretation of the results; (6) evolutionary algorithms; (7) anomaly detection and Gaussian distributions; (8) Bayes and Kalman filters and Gaussian processes.¹

The seminar provides insights into the world of machine learning and enables the students to prepare a scientific presentation and a written report that convey the individually acquired knowledge to a specialist audience.

¹ The topics are adapted to the current state of the art and alternate annually.
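
As a small, concrete taste of the supervised-learning topics listed above, the following sketch trains a k-nearest-neighbor classifier with scikit-learn; the library, the Iris data set, and all parameters are illustrative choices and not part of the seminar materials.

    # Minimal supervised-learning sketch: k-nearest-neighbor classification.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.metrics import accuracy_score

    X, y = load_iris(return_X_y=True)            # 150 samples, 4 features, 3 classes
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)     # hold out 30% for evaluation

    model = KNeighborsClassifier(n_neighbors=5)  # classify by majority vote of 5 neighbors
    model.fit(X_train, y_train)                  # "training" = memorizing the labeled data
    y_pred = model.predict(X_test)

    print("test accuracy:", accuracy_score(y_test, y_pred))

The train/test split and the accuracy score also hint at the model diagnosis and quality metrics covered under regularization and evaluation.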

Literature

  • A. Müller and S. Guido: Introduction to Machine Learning with Python: A Guide for Data Scientists, O’Reilly UK Ltd., 2016
  • K. P. Murphy: Machine Learning – A Probabilistic Perspective, Adaptive Computation and Machine Learning series, MIT Press, 2012
  • T. J. Hastie and R. Tibshirani and J. H. Friedman: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer Series in Statistics, 2009
  • T. M. Mitchell: Machine Learning, McGraw-Hill Education Ltd., 1997
  • F. V. Jensen: An Introduction To Bayesian Networks, Springer, 1996
  • J. A. Freeman: Simulating neural networks – with Mathematica, Addison-Wesley Professional, 1993
  • J. A. Hertz and A. Krogh and R. G. Palmer: Introduction to the theory of neural computation, Westview Press, 1991
  • R. Rojas: Theorie der neuronalen Netze – eine systematische Einführung, Springer, 1993
  • W. Banzhaf and F. D. Francone and R. E. Keller and P. Nordin: Genetic programming – An Introduction: On the Automatic Evolution of Computer Programs and Its Applications, Morgan Kaufmann, 1998
  • M. Mitchell: An introduction to genetic algorithms, MIT Press, 1996
  • Z. Michalewicz: Genetic Algorithms + Data Structures = Evolution Programs, Springer, 1992
  • C. M. Bishop: Pattern Recognition and Machine Learning (Information Science and Statistics), Springer, 2006

Machine Learning – Advances (WS2020/21)
Master (SemML-II) [5 ECTS]
Feigl T., Löffler C.
https://www.studon.fau.de/crs3063931.html
https://univis.fau.de

Content

This seminar introduces the topic of deep learning. Deep learning is one of the most sought-after skills in artificial intelligence. For example, deep learning methods have far exceeded all previous benchmarks for classifying images, text, and speech. Deep learning enables and enhances some of the most interesting applications in the world, such as autonomous vehicles, genome research, humanoid robotics, and real-time translation, and it has beaten the world's best human Go players.

The aim of the seminar is a comprehensive introduction to deep learning. Building on machine learning, it explains how deep learning works, when and why it is important, and highlights the essential methods. Methods include: (1) network architectures and hyperparameters; (2) multi-layer perceptrons; (3) mixtures of neural networks; (4) deep learning for sequences (hidden Markov models, recurrent neural networks, bidirectional and long short-term memory networks, gated recurrent units, temporal convolutional networks); (5) deep learning for images (convolutional neural networks); (6) (deep) reinforcement learning; (7) Markov processes (Gaussian processes and Bayesian optimization, graphical models and Bayesian networks, Kalman and particle filters); (8) online learning and game theory; (9) unsupervised representation learning and generative methods (generative adversarial networks, variational autoencoders); (10) data augmentation and transfer learning.¹

The seminar provides insights into the world of deep learning and enables the students to prepare a scientific presentation and a written report that convey the individually acquired knowledge to a specialist audience.

¹ The topics are adapted to the current state of the art and alternate annually.
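
As a small, concrete taste of the methods listed above, the following sketch trains a tiny convolutional neural network (topic 5) on handwritten digits using TensorFlow/Keras; the framework, the MNIST data set, and the architecture are illustrative choices and not part of the seminar materials.

    # Minimal deep-learning sketch: a small CNN for MNIST digit classification.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train = x_train[..., None] / 255.0   # add channel dimension, scale to [0, 1]
    x_test = x_test[..., None] / 255.0

    model = models.Sequential([
        layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(10, activation="softmax"),   # one output per digit class
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=1, batch_size=128,
              validation_data=(x_test, y_test))

Architecture, optimizer, and number of epochs are deliberately kept minimal; the point is only to show the ingredients (layers, loss, training loop) that the seminar treats in depth.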

Literature

  • I. Goodfellow and Y. Bengio and A. C. Courville: Deep Learning, MIT Press, 2016
  • R. S. Sutton and A. G. Barto: Reinforcement Learning: An Introduction, MIT Press, 1998
  • F. V. Jensen: An Introduction To Bayesian Networks, Springer, 1996
  • R. Rojas: Theorie der neuronalen Netze – eine systematische Einführung, Springer, 1993
  • J. Schmidhuber: Deep Learning in Neural Networks: An Overview, Neural Networks (Journal of the International Neural Network Society), 2015
  • D. Silver et al.: Mastering the Game of Go with Deep Neural Networks and Tree Search, Nature, 2016
  • F. Chollet: Deep Learning with Python, Manning Publications, 2017
  • A. Müller and S. Guido: Introduction to Machine Learning with Python: A Guide for Data Scientists, O’Reilly UK Ltd., 2016
  • T. J. Hastie and R. Tibshirani and J. H. Friedman: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer Series in Statistics, 2009


Archive


2020 – 2021

Machine Learning – Introduction (WS2020/21)
Bachelor (SemML-I) [5 ECTS]
Feigl T., Löffler C.
https://www.studon.fau.de/crs3111212.html
https://univis.fau.de
Machine Learning – Advances (WS2020/21)
Master (SemML-II) [5 ECTS]
Feigl T., Löffler C.
https://www.studon.fau.de/crs3063931.html
https://univis.fau.de

2019 – 2020

Machine Learning (WS2019/20)
Bachelor (SemML-I) [2.5 ECTS]
Feigl T., Löffler C., Mutschler C.
https://www.studon.fau.de/crs2856618.html
Machine Learning (WS2019/20)
Master (SemML-II) [5 ECTS]
Feigl T., Löffler C., Mutschler C.
https://www.studon.fau.de/crs2856618.html


2018 – 2019

Machine Learning (WS2018/19)
Bachelor [2.5 ECTS]
Witt N., Feigl T., Mutschler C.
https://univis.fau.de
Machine Learning (WS2018/19)
Master [5 ECTS]
Witt N., Feigl T., Mutschler C.
https://univis.fau.de


2017 – 2018

Machine Learning (WS2017/18)
Bachelor [2.5 ECTS]
Mutschler C., Witt N., Feigl T.
https://univis.fau.de
Machine Learning (WS2017/18)
Master [5 ECTS]
Mutschler C., Witt N., Feigl T.
https://univis.fau.de