
I'm Markus.
About

I am an Austrian Computer Science and Mathematics enthusiast.
I combine both of my passions by pursuing a PhD in Probabilistic Programming.
PhD Research
Probabilistic programming provides an intuitive means to specify probabilistic models as programs.
By enabling automatic Bayesian posterior inference, probabilistic programming systems allow practitioners to focus on iterative modelling.
In my work, I focus on static analysis, compilation, and acceleration of probabilistic programs.
See the project homepage.
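To make the idea concrete, here is a minimal probabilistic program with automatic posterior inference, written with NumPyro; the model and data are purely illustrative and not taken from my research.

import jax.numpy as jnp
from jax import random
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

def model(observations):
    # prior over the unknown mean
    mu = numpyro.sample("mu", dist.Normal(0.0, 10.0))
    # likelihood: each observation is a noisy measurement of mu
    numpyro.sample("obs", dist.Normal(mu, 1.0), obs=observations)

data = jnp.array([2.1, 1.9, 2.4, 2.2])
mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=1000)
mcmc.run(random.PRNGKey(0), data)
mcmc.print_summary()  # posterior over mu, obtained without writing any inference code

The entire workflow above is the model definition plus a few lines of inference setup; everything else is handled by the system.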
Master's Thesis
The goal of my Master's thesis was to build a Machine Learning model for the prediction of performance regressions in software projects. Using the openly available data of Mozilla's Firefox project, I developed a novel labeling approach to intercept potentially faulty commits before they reach production.

Bachelor's Thesis
My Bachelor's thesis is about a particular approach to reinforcement learning in which the return is modelled directly by a categorical distribution rather than only through its expected value. I derived the sample complexity of tabular algorithms and considered the incorporation of risk measures into the action selection process. The results were published in SIMODS.
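As an illustration of the distributional view (not the thesis code itself), the following NumPy sketch performs the categorical projection of a distributional Bellman backup r + gamma*Z onto a fixed support, in the spirit of C51; the support size and values are arbitrary.

import numpy as np

def categorical_backup(probs, support, r, gamma):
    # project the distribution of r + gamma*Z back onto the fixed support
    vmin, vmax = support[0], support[-1]
    dz = support[1] - support[0]
    shifted = np.clip(r + gamma * support, vmin, vmax)
    new_probs = np.zeros_like(probs)
    for p, z in zip(probs, shifted):
        b = (z - vmin) / dz                          # fractional index of the shifted atom
        lo = int(np.floor(b))
        hi = min(int(np.ceil(b)), len(probs) - 1)    # guard against rounding past the last atom
        if lo == hi:
            new_probs[lo] += p
        else:                                        # split the mass linearly between neighbours
            new_probs[lo] += p * (hi - b)
            new_probs[hi] += p * (b - lo)
    return new_probs

support = np.linspace(-10.0, 10.0, 51)               # 51 atoms
probs = np.full(51, 1.0 / 51)                        # uniform initial return distribution
probs = categorical_backup(probs, support, r=1.0, gamma=0.99)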
Projects
UPIX: Universal Programmable Inference in JAX
Universal probabilistic programming languages (PPLs) like Pyro or Gen enable the user to specify models with stochastic support. This means that control flow and array shapes are allowed to depend on the values sampled during execution, which is fundamentally incompatible with JIT-compilation in JAX. Probabilistic programming systems built on top of JAX, such as NumPyro, are therefore restricted to models with static support, i.e. they disallow Python control flow. UPIX realises the Divide-Conquer-Combine (DCC) approach as a framework that brings JIT-compilation back to universal PPLs.
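The following plain-Python sketch shows what stochastic support means in practice (it is illustrative only and does not use UPIX's actual API): the number of latent variables, and hence the shape of the trace, depends on a sampled value, which jax.jit cannot trace statically.

import random

def model():
    k = 0
    # keep flipping a coin; the number of flips is itself random
    while random.random() < 0.5:
        k += 1
    # the length of x depends on k, so the trace shape changes between runs
    x = [random.gauss(0.0, 1.0) for _ in range(k)]
    return k, x

Roughly, DCC handles such models by splitting them into their control-flow paths: each path is a straight-line subprogram with static support that can be JIT-compiled on its own, and the per-path inference results are then recombined.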
Strongly Solving Connect4
I generated the first win-draw-loss look-up table for 7x6 Connect4. This was achieved by symbolic search with binary decision diagrams.
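To illustrate the underlying data structure (this is not the solver itself), here is a minimal reduced ordered BDD in Python with hash-consing and a generic apply; a computed-table cache is omitted for brevity.

TRUE, FALSE = 1, 0                       # terminal nodes
_unique = {}                             # unique table: (var, low, high) -> node id
_node = {}                               # node id -> (var, low, high)
_next_id = 2

def mk(var, low, high):
    # create (or reuse) the node for: if var then high else low
    global _next_id
    if low == high:                      # reduction rule
        return low
    key = (var, low, high)
    if key not in _unique:
        _unique[key] = _next_id
        _node[_next_id] = key
        _next_id += 1
    return _unique[key]

def variable(v):
    return mk(v, FALSE, TRUE)

def apply_op(op, u, w):
    # combine two BDDs with a boolean operator, e.g. lambda a, b: a and b
    if u in (0, 1) and w in (0, 1):
        return int(op(bool(u), bool(w)))
    uvar = _node[u][0] if u > 1 else float("inf")
    wvar = _node[w][0] if w > 1 else float("inf")
    top = min(uvar, wvar)
    u0, u1 = (_node[u][1], _node[u][2]) if uvar == top else (u, u)
    w0, w1 = (_node[w][1], _node[w][2]) if wvar == top else (w, w)
    return mk(top, apply_op(op, u0, w0), apply_op(op, u1, w1))

# encode x0 AND (x1 OR x2)
f = apply_op(lambda a, b: a and b, variable(0),
             apply_op(lambda a, b: a or b, variable(1), variable(2)))

In the actual solver, such BDDs represent whole sets of positions, and the game is solved by symbolic search over these sets rather than by enumerating positions one by one.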
TinyPPL: Educational implementation of a probabilistic programming language in Julia
To better understand the inner workings of PPLs and inference algorithms, I implemented them from scratch without too many abstractions.
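For flavour, here is a Python sketch of the kind of machinery such an implementation contains: a sample/observe interface plus likelihood weighting as the simplest inference algorithm. It is illustrative only; TinyPPL itself is written in Julia and structured differently.

import math, random

class Normal:
    def __init__(self, mu, sigma):
        self.mu, self.sigma = mu, sigma
    def rand(self):
        return random.gauss(self.mu, self.sigma)
    def logpdf(self, x):
        z = (x - self.mu) / self.sigma
        return -0.5 * z * z - math.log(self.sigma) - 0.5 * math.log(2 * math.pi)

class Trace:
    def __init__(self):
        self.log_weight = 0.0
    def sample(self, dist):
        return dist.rand()                           # draw latents from the prior
    def observe(self, dist, value):
        self.log_weight += dist.logpdf(value)        # accumulate the likelihood

def model(trace, data):
    mu = trace.sample(Normal(0.0, 10.0))
    for x in data:
        trace.observe(Normal(mu, 1.0), x)
    return mu

def likelihood_weighting(data, n=10000):
    # posterior mean of mu as a weighted average over prior samples
    weights, values = [], []
    for _ in range(n):
        t = Trace()
        values.append(model(t, data))
        weights.append(math.exp(t.log_weight))
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

print(likelihood_weighting([2.1, 1.9, 2.4, 2.2]))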
Predicting Chemical Properties with 3D Convolutional Neural Network
A crystal is described via its Fourier transform in reciprocal space, and a 3D convolutional neural network predicts the enthalpy per atom.
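A minimal PyTorch sketch of such an architecture: the input is a voxel grid derived from the reciprocal-space representation, the output a single scalar, the predicted enthalpy per atom. The framework, layer sizes, and grid resolution here are assumptions for illustration and do not correspond to the actual project.

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool3d(2),
    nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool3d(2),
    nn.Flatten(),
    nn.Linear(32 * 8 * 8 * 8, 64), nn.ReLU(),
    nn.Linear(64, 1),                    # predicted enthalpy per atom
)

grid = torch.randn(4, 1, 32, 32, 32)     # batch of 4 dummy voxel grids
print(model(grid).shape)                 # torch.Size([4, 1])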

Chess Engine
I programmed a fast chess simulator and used MTD(f) as the search algorithm (a sketch of the MTD(f) driver is shown below).
The final engine includes a small opening book and a self-generated 3-men endgame tablebase.
It is on par with chess.com engines up to level 20, which chess.com considers to be of advanced to expert level strength.
Game as white vs chess.com computer level 16.
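The driver itself is short; alpha_beta_with_memory is an assumed helper name standing in for a fail-soft alpha-beta search with a transposition table, which is where most of the actual engine lives.

INF = 10**9

def mtdf(position, first_guess, depth):
    g = first_guess
    lower, upper = -INF, INF
    while lower < upper:
        beta = g + 1 if g == lower else g
        # zero-window search around beta
        g = alpha_beta_with_memory(position, beta - 1, beta, depth)
        if g < beta:
            upper = g                    # search failed low
        else:
            lower = g                    # search failed high
    return g

Each iteration narrows the window [lower, upper] until it collapses onto the minimax value; with a good first guess, for example the value from the previous depth during iterative deepening, only a handful of zero-window searches are needed.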
MNIST from Scratch
Implementation of an automatic differentiation library in Julia, tested with a deep neural network on the MNIST dataset.
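The core of reverse-mode automatic differentiation fits in a few lines; the following Python sketch mirrors the idea (the actual library is in Julia): every value remembers how to push gradients back to its inputs, and the backward pass walks the computation graph in reverse topological order.

import math

class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda g: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward(g):
            self.grad += g
            other.grad += g
        out._backward = backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward(g):
            self.grad += g * other.data
            other.grad += g * self.data
        out._backward = backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def backward(g):
            self.grad += g * (1.0 - t * t)
        out._backward = backward
        return out

    def backprop(self):
        # build a topological order (inputs first), then push gradients back from the output
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward(v.grad)

x, w = Value(0.5), Value(-1.2)
y = (x * w).tanh()
y.backprop()
print(x.grad, w.grad)                    # dy/dx and dy/dw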
CartPole with RL
Reinforcement Learning in a continuous state space. The task was to swing up and balance a pendulum by applying only a horizontal force to the cart. A neural network served as function approximator over the continuous states.
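One common way to combine a neural network with TD learning in this setting is a Q-network trained on bootstrapped targets. The PyTorch sketch below is illustrative only and not necessarily the exact method used in the project; it assumes a 4-dimensional state and two discrete actions (push left / push right).

import torch
import torch.nn as nn

n_state, n_action, gamma = 4, 2, 0.99
q_net = nn.Sequential(nn.Linear(n_state, 64), nn.ReLU(), nn.Linear(64, n_action))
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)

def td_update(state, action, reward, next_state, done):
    q = q_net(state)[action]                         # Q(s, a)
    with torch.no_grad():                            # bootstrapped TD target
        target = reward + (0.0 if done else gamma * q_net(next_state).max())
    loss = (q - target) ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# dummy transition, just to show the call
td_update(torch.randn(4), 1, 1.0, torch.randn(4), False)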

Goomo
The Goomo robot was the product of a student project during my internship at iteratec. A Segway vehicle with an Android smartphone mounted on top allowed us to experiment creatively with writing autonomous driving software.
Finite Field Generator
Ridiculously slow calculation of the addition and multiplication tables of any finite field,
based on modular integer polynomial arithmetic.
Produces fancy pictures.
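A sketch of the underlying arithmetic, shown for GF(4) = GF(2^2) with the irreducible polynomial x^2 + x + 1; elements are coefficient tuples, lowest degree first. This is illustrative only, and the actual generator handles arbitrary finite fields.

from itertools import product

p = 2
irreducible = (1, 1, 1)                  # 1 + x + x^2, monic and irreducible over GF(2)

def poly_mod(a, m, p):
    # reduce polynomial a modulo the monic polynomial m, coefficients mod p
    a = list(a)
    while len(a) >= len(m):
        if a[-1] != 0:
            shift = len(a) - len(m)
            factor = a[-1]
            for i, c in enumerate(m):
                a[shift + i] = (a[shift + i] - factor * c) % p
        a.pop()
    return tuple(a) + (0,) * (len(m) - 1 - len(a))

def add(a, b):
    return tuple((x + y) % p for x, y in zip(a, b))

def mul(a, b):
    prod = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            prod[i + j] = (prod[i + j] + x * y) % p
    return poly_mod(prod, irreducible, p)

elements = list(product(range(p), repeat=len(irreducible) - 1))
for a in elements:
    print([mul(a, b) for b in elements])  # rows of the multiplication table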
Personal Website
I created a website.
Contact
You can find some of the above projects on my GitHub, and you can send me a message via e-mail.