Researchers have developed a method to adapt pretrained Softmax attention models into linear-complexity architectures using Test-Time Training (TTT). The approach closes the representational gap between the two attention mechanisms by aligning their architectures and representations. Applied to Stable Diffusion 3.5, it yields a new model, SD3.5-T$^5$, which matches the original's image quality while delivering significantly faster inference after only one hour of fine-tuning.
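To make the idea concrete, below is a minimal sketch of a TTT-style linear-complexity layer. This is an illustration of the general Test-Time Training idea, not the paper's actual architecture: the layer's hidden state is itself a small linear model `W`, updated by one gradient step per token on a self-supervised reconstruction loss, so the sequence is processed in O(n) time rather than the O(n^2) of Softmax attention. The projection matrices `theta_K`, `theta_Q`, `theta_V` and the learning rate are hypothetical names for this sketch.

```python
import numpy as np

def ttt_linear_layer(x, theta_K, theta_Q, theta_V, lr=0.1):
    """Illustrative TTT-style linear layer (not the paper's exact design).

    The hidden state is a weight matrix W, updated at test time with one
    gradient step per token on the self-supervised loss ||W k - v||^2,
    then read out with the query projection. Cost is linear in length.
    """
    d = theta_V.shape[0]
    W = np.zeros((d, d))                        # hidden state: a small linear model
    outputs = []
    for x_t in x:                               # sequential scan: O(n) in sequence length
        k, q, v = theta_K @ x_t, theta_Q @ x_t, theta_V @ x_t
        grad = 2.0 * np.outer(W @ k - v, k)     # gradient of ||W k - v||^2 w.r.t. W
        W = W - lr * grad                       # test-time update of the state
        outputs.append(W @ q)                   # read out with the query view
    return np.stack(outputs)
```

Because the per-token state is a fixed-size matrix rather than a growing key-value cache, memory and compute stay constant per step, which is what makes such layers attractive as drop-in replacements for Softmax attention in diffusion backbones.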
IMPACT Accelerates inference for diffusion models by enabling efficient adaptation of pretrained weights to linear-complexity architectures.
RANK_REASON Academic paper detailing a new method for adapting existing models to different architectures.