“Science is not a collection of facts, but a constant struggle with […] nature.” – Judea Pearl
Thanks for your interest in my research.
I am a statistician working on reliable machine learning. In particular, I try to incorporate epistemic model uncertainty into adaptive machine learning methods such as self-training, superset learning, or Bayesian optimization.
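As a rough illustration of what incorporating epistemic model uncertainty can mean in Bayesian optimization, here is a minimal, hypothetical sketch: instead of committing to a single Gaussian process prior, a small set of priors is carried along, and candidate points are judged by their worst-case acquisition value over that set. The toy objective, the length-scale set, and the scikit-learn-based implementation are illustrative assumptions of mine, not a reproduction of any of the papers listed below.

```python
# Minimal, hypothetical sketch: Bayesian optimization that is cautious with
# respect to a *set* of GP priors (worst-case lower confidence bound).
# Toy objective and length scales are made up for illustration.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def objective(x):
    """Toy black-box function to be minimized."""
    return np.sin(3 * x) + 0.1 * x ** 2

# Initial design
X = rng.uniform(-3, 3, size=(5, 1))
y = objective(X).ravel()

# A set of GP priors (RBF kernels with different length scales)
# stands in for epistemic uncertainty about the model itself.
prior_set = [RBF(length_scale=ls) for ls in (0.3, 1.0, 3.0)]
candidates = np.linspace(-3, 3, 200).reshape(-1, 1)

for _ in range(10):
    lcb_per_prior = []
    for kernel in prior_set:
        # optimizer=None keeps each prior's length scale fixed
        gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-6,
                                      normalize_y=True, optimizer=None)
        gp.fit(X, y)
        mu, sd = gp.predict(candidates, return_std=True)
        lcb_per_prior.append(mu - 2.0 * sd)  # lower confidence bound
    # Cautious (minimax-style) choice: evaluate the candidate whose
    # worst-case acquisition value over the prior set is best.
    worst_case = np.max(lcb_per_prior, axis=0)
    x_next = candidates[[np.argmin(worst_case)]]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best observed value:", y.min())
```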
Research Interests
- All things Bayesian
- Imprecise Probabilities
- Generalized Bayes
- Imprecise Gaussian Processes
- Optimization
- Weak Supervision
- Sampling Theory
- … 1
Publications
- Julian Rodemann, Christoph Jansen, Georg Schollmeyer (2024): Reciprocal Learning. Neural Information Processing Systems NeurIPS 2024, forthcoming.
- Christoph Jansen, Georg Schollmeyer, Julian Rodemann, Hannah Blocher, Thomas Augustin (2024): Statistical Multicriteria Benchmarking via the GSD-Front. Neural Information Processing Systems NeurIPS (Spotlight) 2024, forthcoming.
- Esteban Garces Arias, Julian Rodemann, Meimingwei Li, Christian Heumann, Matthias Aßenmacher (2024): Adaptive Contrastive Search: Uncertainty-Guided Decoding for Open-Ended Text Generation. Conference on Empirical Methods in Natural Language Processing EMNLP Findings, Miami, USA, forthcoming.
- Julian Rodemann, Thomas Augustin (2024): Imprecise Bayesian Optimization. Knowledge-Based Systems 2024, Elsevier.
- Stefan Dietrich, Julian Rodemann, Christoph Jansen (2024): Semi-Supervised Learning guided by the Generalized Bayes Rule under Soft Revision. 11th International Conference on Soft Methods in Probability and Statistics SMPS 2024, Salzburg, Austria.
- Julian Rodemann (2024): Towards Bayesian Data Selection. 5th Workshop on Data-Centric Machine Learning Research DMLR at ICML 2024.
- Julian Rodemann, Hannah Blocher (2024): Partial Rankings of Optimizers. International Conference on Learning Representations ICLR 2024, Tiny Papers Track, Vienna, Austria.
- Malte Nalenz, Julian Rodemann, Thomas Augustin (2024): Learning De-Biased Regression Trees and Forests from Complex Samples. Machine Learning, Springer.
- Julian Rodemann (2023): Pseudo-Label Selection Is a Decision Problem (Short Paper). 46th German Conference on Artificial Intelligence KI, Berlin, Germany. LNAI, Springer.
- Julian Rodemann, Jann Goschenhofer, Emilio Dorigatti, Thomas Nagler, Thomas Augustin (2023): Approximately Bayes-Optimal Pseudo-Label Selection. 39th Conference on Uncertainty in Artificial Intelligence UAI, Pittsburgh, USA. PMLR.
- Christoph Jansen, Georg Schollmeyer, Hannah Blocher, Julian Rodemann, Thomas Augustin (2023): Robust Statistical Comparison of Random Variables with Locally Varying Scale of Measurement. 39th Conference on Uncertainty in Artificial Intelligence UAI, Pittsburgh, USA. PMLR.
- Julian Rodemann, Christoph Jansen, Georg Schollmeyer, Thomas Augustin (2023): In All Likelihood(s): Robust Selection of Pseudo-Labeled Data. 13th International Symposium on Imprecise Probabilities: Theories and Applications ISIPTA, Oviedo, Spain. PMLR.
- Julian Rodemann, Dominik Kreiss, Eyke Hüllermeier, Thomas Augustin (2022): Levelwise Data Disambiguation by Cautious Superset Classification. 15th International Conference on Scalable Uncertainty Management SUM, Paris, France. LNAI, Springer.
- Julian Rodemann, Thomas Augustin (2022): Accounting for Gaussian Process Imprecision in Bayesian Optimization. 9th International Symposium on Integrated Uncertainty in Knowledge Modelling and Decision Making IUKM, Ishikawa, Japan. LNAI, Springer.
Some Talks and Posters
- Julian Rodemann, Thomas Augustin, Rianne de Heide (2023): Interpreting Generalized Bayesian Inference by Generalized Bayesian Inference. ISIPTA 2023.
- Alexander Marquardt, Julian Rodemann, Thomas Augustin (2023): An Empirical Study of Prior-Data Conflicts in Bayesian Neural Networks. ISIPTA 2023.
- Julian Rodemann, Jann Goschenhofer, Emilio Dorigatti, Thomas Nagler, Thomas Augustin (2023): Approximately Bayes-Optimal Pseudo Label Selection. Poster accepted for the Fifth Symposium on Advances in Approximate Bayesian Inference AABI, co-located with ICML 2023.
- Julian Rodemann (2023): Learning Under Weak Supervision: Insights from Decision Theory. Young Statistician’s Lecture Series (YSLS). International Biometric Society (IBS) Early Career Working Group, Germany.
- Malte Nalenz, Julian Rodemann, Thomas Augustin (2022): De-biased Regression Trees for Complex Data. Statistical Week by the German Statistical Society (DStatG) in Münster, Germany.
- Julian Rodemann, Sebastian Fischer, Malte Nalenz, Lennart Schneider, Thomas Augustin (2022): Not All Data Are Created Equal - Lessons from Sampling Theory for Adaptive Machine Learning. Poster presented at ICSDS 2022 hosted by the Institute of Mathematical Statistics (IMS).
- Julian Rodemann (2022): Prior-RObust Bayesian Optimization (PROBO). DAGStat Joint Statistical Meeting in Hamburg, Germany.
- Julian Rodemann, Thomas Augustin (2022): A Deep Dive Into BO Sensitivity and PROBO. Young Statistician’s Lecture Series (YSLS). IBS Early Career Working Group, Germany.
- Julian Rodemann, Thomas Augustin (2021): Accounting for Imprecision of Model Specification in Bayesian Optimization. Poster presented at ISIPTA, Granada, Spain.
- Julian Rodemann (2017): Clustering lifecycles. Villigst Machine Learning Undergraduate Workshop hosted by the Max-Planck-Institute for Intelligent Systems (MPI-IS) in Tübingen, Germany.