In the broader context of teaching data science, members of this initiative founded, implemented, and are still managing the master's program in Computational and Data Science that was introduced at FSU Jena in 2014. In the specific context of this initiative, we regularly teach the following courses, which cover topics relevant to the design and efficient implementation of interactive inference algorithms.


Inference in Probabilistic Models (Joachim Giesen, Michael Habeck, and Wolfhart Feldmeier)

This course is offered in winter terms and serves as an introduction to probabilistic inference. It first provides a taxonomy of inference problems and their computational complexity before covering several exact, sampling-based (Markov chain Monte Carlo), and optimization-based (variational) inference algorithms.
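As a pointer to the sampling-based family, the following is a minimal sketch (not course material) of a random-walk Metropolis sampler, the simplest Markov chain Monte Carlo algorithm, here targeting a standard normal density:

```python
# Minimal random-walk Metropolis sampler (illustrative sketch).
import numpy as np

def log_target(x):
    # log of an (unnormalized) standard normal density
    return -0.5 * x**2

def metropolis(n_samples, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.normal()
        # accept with probability min(1, target(proposal) / target(x))
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

samples = metropolis(10_000)
print(samples.mean(), samples.std())  # roughly 0 and 1
```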


Statistical Learning Theory (Joachim Giesen)

This course addresses the question under which conditions learning is possible at all. PAC (probably approximately correct) learning is a theory that provides an answer to this question. The framework of PAC learning is supervised learning, in which one observes labels on features and the goal is to predict the label for a given, not necessarily observed, feature. The course is offered every summer term.
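For orientation, the standard (agnostic) PAC guarantee can be stated as follows; the notation is ours, not taken verbatim from the course. For all accuracy and confidence parameters $\varepsilon, \delta \in (0,1)$, given a sample whose size is polynomial in $1/\varepsilon$ and $1/\delta$, the learner must output a hypothesis $h$ from the class $\mathcal{H}$ with

```latex
\Pr\left[\, \operatorname{err}(h) \le \min_{h' \in \mathcal{H}} \operatorname{err}(h') + \varepsilon \,\right] \ge 1 - \delta .
```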


Automatic Differentiation (Torsten Bosse and Martin Bücker)

Given a computer program, techniques of automatic differentiation use well-defined rules to mechanically generate another computer program that computes the input program's sensitivities. Using Matlab as an illustrative example, this course introduces students to these program transformations and shows how to apply them in practice.
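The rule-based transformation can be illustrated with forward-mode automatic differentiation via dual numbers; the following sketch is in Python rather than the Matlab used in the course. Each value carries its derivative, and overloaded operators apply the chain rule mechanically:

```python
# Forward-mode automatic differentiation via dual numbers (sketch).
import math

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # chain rule for the sine: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# f(x) = x * sin(x); its value and derivative at x = 2 in one pass
x = Dual(2.0, 1.0)   # seed the sensitivity dx/dx = 1
y = x * sin(x)
print(y.val, y.dot)  # f(2) and f'(2) = sin(2) + 2*cos(2)
```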


Parallel Computing (Alexander Breuer and Martin Bücker)

The two courses Parallel Computing I (winter term) and II (summer term) give a gentle introduction to scalable algorithms and the efficient use of parallel systems. Students gain practical experience in parallel programming for computers with distributed memory (MPI), shared memory (OpenMP), and graphics processing units (CUDA).
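To give a flavor of the distributed-memory model, here is a small sketch using mpi4py, the Python bindings for MPI (the courses themselves work with MPI from compiled languages); the file name in the run command is hypothetical:

```python
# Distributed quadrature for pi with MPI (illustrative sketch).
# Run with, e.g.: mpiexec -n 4 python pi_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each rank sums its slice of a midpoint-rule quadrature for pi.
n = 10_000_000
h = 1.0 / n
local = sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2)
            for i in range(rank, n, size)) * h

# Combine the partial sums on rank 0.
pi = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print(pi)
```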


Algorithm Engineering (Mark Blacher)

This course covers the algorithm engineering cycle, which consists of algorithm design, analysis, implementation, and experimental evaluation, while considering aspects of modern hardware such as caches, low-precision computations, vector and matrix instructions, and many-core processors, as well as realistic assumptions about the data. The course is offered every winter term.
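A micro-experiment in this spirit, sketched in Python with NumPy (assumptions: a C-ordered, i.e. row-major, array), shows how the memory-access pattern of an otherwise identical reduction interacts with the cache:

```python
# Same O(n^2) reduction, two traversal orders (illustrative sketch).
import time
import numpy as np

a = np.random.rand(4096, 4096)  # row-major layout

def timed(f):
    t0 = time.perf_counter()
    f()
    return time.perf_counter() - t0

# Summing along rows streams through contiguous memory ...
t_rows = timed(lambda: sum(a[i, :].sum() for i in range(a.shape[0])))
# ... while summing along columns jumps a full row per element.
t_cols = timed(lambda: sum(a[:, j].sum() for j in range(a.shape[1])))
print(f"row-wise {t_rows:.3f}s  column-wise {t_cols:.3f}s")
```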


Visualization (Kai Lawonn)

After a short introduction to the foundations of perception, this course covers topics from scientific visualization such as vector field, flow, and volume visualization, and provides an introduction to selected topics from information visualization. The course is offered in summer terms.
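As a small, self-contained taste of vector field visualization (the example field and plot choices are ours, not the course's), a quiver plot renders a rotational field $v(x, y) = (-y, x)$:

```python
# Vector field visualization with a quiver plot (illustrative sketch).
import numpy as np
import matplotlib.pyplot as plt

x, y = np.meshgrid(np.linspace(-2, 2, 20), np.linspace(-2, 2, 20))
u, v = -y, x  # a simple rotational field

plt.quiver(x, y, u, v)
plt.gca().set_aspect("equal")
plt.title("Vector field v(x, y) = (-y, x)")
plt.show()
```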


Efficient Machine Learning (Torsten Bosse, Alexander Breuer, and Martin Bücker)

This course gives an introduction to the efficient and scalable techniques that are necessary to apply machine learning algorithms to large-scale problems. It involves practical programming assignments in PyTorch that examine how a production-ready machine learning framework works. Topics include optimization algorithms, differentiation, custom extensions, distributed machine learning, and scientific machine learning.
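Two of these topics, differentiation and custom extensions, can be sketched together in PyTorch: a hand-written autograd function for the softplus activation, used inside a standard training step (this is our illustration, not an assignment from the course):

```python
# Custom autograd extension in PyTorch (illustrative sketch).
import torch

class Softplus(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.log1p(torch.exp(x))
    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * torch.sigmoid(x)  # d softplus / dx

w = torch.randn(3, requires_grad=True)
opt = torch.optim.SGD([w], lr=0.1)
x = torch.randn(8, 3)

loss = Softplus.apply(x @ w).mean()
loss.backward()  # invokes our hand-written backward rule
opt.step()       # one gradient descent update
```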


Proof Complexity and SAT Solving (Olaf Beyersdorff)

The course serves as an introduction to proof complexity and algorithmic aspects of the satisfiability (SAT) problem. Specific topics are important proof systems, hard formulas for resolution, game-theoretic techniques for lower bounds, algorithms for special cases (Horn formulas, 2-SAT), DPLL and CDCL algorithms, the relationship between proof systems and SAT solvers, geometric and algebraic proof systems, Frege calculi, quantified Boolean formulas, proof systems for modal logic, and local search algorithms. The course is offered in winter terms.
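To make one of these algorithms concrete, the following is a compact sketch of DPLL (unit propagation plus splitting) in Python; the clause encoding, with nonzero integers and negative meaning negated, is a common convention and not specific to the course:

```python
# DPLL SAT solver: unit propagation plus splitting (sketch).
def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})

    def simplify(clauses, lit):
        out = []
        for c in clauses:
            if lit in c:
                continue                 # clause satisfied, drop it
            reduced = [l for l in c if l != -lit]
            if not reduced:
                return None              # empty clause: conflict
            out.append(reduced)
        return out

    # unit propagation: assign forced literals until a fixed point
    while True:
        units = [c[0] for c in clauses if len(c) == 1]
        if not units:
            break
        lit = units[0]
        assignment[abs(lit)] = lit > 0
        clauses = simplify(clauses, lit)
        if clauses is None:
            return None
    if not clauses:
        return assignment                # all clauses satisfied
    # split on the first literal of the first remaining clause
    lit = clauses[0][0]
    for choice in (lit, -lit):
        branch = simplify(clauses, choice)
        if branch is not None:
            result = dpll(branch, {**assignment, abs(choice): choice > 0})
            if result is not None:
                return result
    return None

# (x1 or x2) and (not x1 or x2) and (not x2 or x3)
print(dpll([[1, 2], [-1, 2], [-2, 3]]))  # {1: True, 2: True, 3: True}
```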