Role
ML Researcher & Lecturer
Affiliation
MSc candidate · FHTW
Based in
Vienna, AT
Focus
VLMs · Looped transformers · Regularization
hey I'm

David Seyser. I build models
that think in loops.

Researcher & MSc candidate at FHTW, working on a from-scratch vision–language model, regularization for post-training, and recurrent transformers that reason by iteration instead of by scale.

Looking for
Full-time ML research or engineering
Industry research labs, applied science, or small focused teams.
Start
After my MSc wraps in 2026
FHTW, Vienna. Thesis on looped language models.
Based in
Austria · open to remote EU or relocation
Stack
PyTorch · JAX · CUDA · Python · C++
usually reply in under 48h
research

Three things I'm pushing on.

From scratch · not fine-tunes
01

IRIS

Active · From scratch · Vision + Language

A vision–language model built around Ouro.

500M Params
4 Training runs
2026 Release target
02

LoopLMs

Flagship · Architecture · Paper in draft

Recurrent transformers that can think for longer when a problem is harder. Same parameters, variable depth, a small model that decides to compute more. My bet on what comes after raw scale.
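The core idea can be sketched in a few lines of PyTorch: one weight-tied block reused a variable number of times, so depth becomes a runtime decision rather than a fixed layer count. This is a minimal illustrative sketch, not the thesis architecture; all names, dimensions, and the fixed loop counts are assumptions.

```python
import torch
import torch.nn as nn

class LoopedBlock(nn.Module):
    """A single weight-tied transformer block applied repeatedly.

    Hypothetical sketch: dimensions and the loop schedule are
    illustrative, not the architecture described above.
    """

    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor, n_loops: int) -> torch.Tensor:
        # The same parameters are reused n_loops times: effective depth
        # is chosen per input, without adding any new weights.
        for _ in range(n_loops):
            h = self.norm1(x)
            x = x + self.attn(h, h, h, need_weights=False)[0]
            x = x + self.ff(self.norm2(x))
        return x

block = LoopedBlock()
tokens = torch.randn(2, 10, 64)     # (batch, seq, d_model)
shallow = block(tokens, n_loops=2)  # easy input: few iterations
deep = block(tokens, n_loops=8)     # hard input: more compute, same weights
```

In a full model, `n_loops` would come from a learned halting signal rather than being passed in by the caller; the point here is just that compute scales with iteration count while the parameter count stays fixed.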

Effective depth
−2.1 Loss vs. baseline
2026 Paper target
03

ReSiReg

In Progress · Post-training · CoRL 2026

shipped

What I did.

Papers · models · teaching
2025
SOMP: Stiffness-Optimized Motion Planning
Workshop · peer-reviewed
Published
2026
ReSiReg regularization for stable post-training
CoRL
Under review
2026
IRIS
HuggingFace model
Published
SS '26
Course · Moderne Verfahren zur Sensorbasierten Roboterregelung (Modern Methods for Sensor-Based Robot Control)
FHTW · MSc · 12 weeks
Teaching
WS '25
Course · Robot Based Manufacturing
FHTW · MSc · 14 weeks
Teaching
say hi

Let's actually
build tomorrow.

Hiring, collaborating, or just want to nerd out about loops? Email's fastest. I read everything.

© 2026 David Seyser · Vienna, AT
Today for a better tomorrow.