MambaPainter: Neural Stroke-Based Rendering in a Single Step
Abstract: Stroke-based rendering is a technique that transforms an input image into an oil-painting style by predicting a sequence of brush strokes. However, conventional methods typically predict strokes one at a time, resulting in slow processing and limited practicality. In this study, we propose MambaPainter, a novel method capable of predicting more than 100 brush strokes in a single inference step. By incorporating a selective state-space model, our approach enables significantly faster stylization than sequential stroke prediction. In addition, we introduce a patch-based strategy that allows efficient rendering of high-resolution images. Experimental results demonstrate that MambaPainter translates images into an oil-painting style more efficiently than existing methods.
Authors: Tomoya Sawada, Marie Katsurai
Publication venue: SIGGRAPH Asia 2024 Posters
Demo

Code and Reference
https://github.com/STomoya/MambaPainter
Tomoya Sawada and Marie Katsurai. 2024. MambaPainter: Neural Stroke-Based Rendering in a Single Step. In SIGGRAPH Asia 2024 Posters (SA ’24). Association for Computing Machinery, New York, NY, USA, Article 98, 1–2. DOI: 10.1145/3681756.3697906.
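
For intuition, the following Python sketch illustrates the single-step, patch-based stroke prediction idea described in the abstract. It is not the MambaPainter implementation: the small convolutional encoder stands in for the selective state-space (Mamba) backbone, and the 8-value stroke parameterization and soft-rectangle renderer are illustrative placeholders; see the repository above for the actual code.

# Minimal, hypothetical sketch of single-step, patch-based stroke prediction.
# NOT the MambaPainter implementation: the encoder, the 8-value stroke
# parameterization, and the soft-rectangle renderer are placeholders.
import torch
import torch.nn as nn


class StrokePredictor(nn.Module):
    # Predicts parameters for all N strokes of a patch in one forward pass.
    def __init__(self, n_strokes=100, n_params=8):
        super().__init__()
        self.encoder = nn.Sequential(          # stand-in for a Mamba/SSM backbone
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, n_strokes * n_params)
        self.n_strokes, self.n_params = n_strokes, n_params

    def forward(self, patches):                # patches: (B, 3, H, W)
        feats = self.encoder(patches)          # (B, 64)
        params = torch.sigmoid(self.head(feats))
        return params.view(-1, self.n_strokes, self.n_params)


def render_strokes(params, size=64):
    # Composites soft rectangular strokes onto a white canvas, in predicted order.
    # params[..., :] = (cx, cy, w, h, r, g, b, alpha), all in [0, 1].
    b, n, _ = params.shape
    canvas = torch.ones(b, 3, size, size)
    ys, xs = torch.meshgrid(torch.linspace(0, 1, size),
                            torch.linspace(0, 1, size), indexing="ij")
    for j in range(b):
        for i in range(n):
            cx, cy, w, h = params[j, i, :4]
            color, alpha = params[j, i, 4:7], params[j, i, 7]
            mask = (((xs - cx).abs() < w / 2) & ((ys - cy).abs() < h / 2)).float() * alpha
            canvas[j] = canvas[j] * (1 - mask) + color[:, None, None] * mask
    return canvas


if __name__ == "__main__":
    # Split a 256x256 image into 64x64 patches, predict 100 strokes per patch
    # in one batched forward pass, then render each patch independently.
    image = torch.rand(1, 3, 256, 256)
    patches = image.unfold(2, 64, 64).unfold(3, 64, 64)        # (1, 3, 4, 4, 64, 64)
    patches = patches.permute(0, 2, 3, 1, 4, 5).reshape(-1, 3, 64, 64)
    model = StrokePredictor()
    stroke_params = model(patches)                             # (16, 100, 8)
    rendered = render_strokes(stroke_params)                   # (16, 3, 64, 64)
    print(rendered.shape)

The key point is that all stroke parameters for a patch are produced by one forward pass, and all patches can be processed as a single batch, which removes the per-stroke sequential loop of earlier stroke-based renderers.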

