A generative model learns to reconstruct biological sequences by iteratively unmasking tokens. This demo visualises the process on a simple image; the same principle applies to DNA, where each pixel becomes a nucleotide (A, C, G, T).
Forward process: tokens are progressively masked until the sequence is fully corrupted.
Reverse process: a neural network iteratively predicts and unmasks tokens,
generating new sequences that follow the learned grammar.
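The two processes above can be sketched in a few lines. This is a minimal illustration, not the demo's actual implementation: it assumes per-token independent masking at corruption level t, and uses a placeholder `predict` function standing in for the trained network.

```python
import random

ALPHABET = ["A", "C", "G", "T"]
MASK = "?"

def forward_mask(seq, t, rng):
    # Forward process: mask each token independently with probability t,
    # where t ranges from 0 (clean) to 1 (fully corrupted).
    return [MASK if rng.random() < t else tok for tok in seq]

def reverse_unmask(seq, predict, rng, steps=4):
    # Reverse process: over several steps, pick a batch of masked positions
    # and fill each one with the predictor's guess, conditioned on the
    # partially unmasked sequence.
    seq = list(seq)
    for step in range(steps):
        masked = [i for i, tok in enumerate(seq) if tok == MASK]
        if not masked:
            break
        k = max(1, len(masked) // (steps - step))
        for i in rng.sample(masked, k):
            seq[i] = predict(seq, i)
    return seq

rng = random.Random(0)
corrupted = forward_mask(list("ACGTACGT"), 1.0, rng)
# A real model would score nucleotides given context; here we sample uniformly.
generated = reverse_unmask(corrupted, lambda s, i: rng.choice(ALPHABET), rng)
```

After the loop finishes, every masked position has been replaced by a nucleotide; a trained predictor would make those choices follow the learned grammar rather than a uniform draw.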
Interactive companion to poster, Rotation Project 2, 2026