Parameter-Free Style Projection for Arbitrary Image Style Transfer
Author
Abstract
Arbitrary image style transfer is a challenging task that aims to stylize a content image conditioned on arbitrary style images. In this task, the feature-level content-style transformation plays a vital role in the proper fusion of features. Existing feature transformation algorithms often suffer from loss of content or style details, unnatural stroke patterns, and unstable training. To mitigate these issues, this paper proposes a new feature-level style transformation technique, named Style Projection, for parameter-free, fast, and effective content-style transformation. The paper further presents a real-time feed-forward model that leverages Style Projection for arbitrary image style transfer and includes a regularization term for matching the semantics between input content and stylized outputs. Extensive qualitative analysis, quantitative evaluation, and a user study demonstrate the effectiveness and efficiency of the proposed methods.
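The record does not spell out the Style Projection operator itself, so the sketch below illustrates one plausible parameter-free, feature-level content-style transformation: a per-channel, rank-based reordering of style feature values onto the content feature's spatial positions. The function name `rank_based_style_projection`, the rank-based formulation, and the (C, H, W) encoder-feature shapes are assumptions made for illustration, not the paper's definition of Style Projection.

```python
# Illustrative sketch only: one possible parameter-free feature-level
# content-style transformation. It has no learnable parameters and runs
# in a single forward pass; it is NOT claimed to be the paper's operator.
import torch


def rank_based_style_projection(content_feat: torch.Tensor,
                                style_feat: torch.Tensor) -> torch.Tensor:
    """Reorder style feature values to follow the per-channel rank order
    of the content features.

    content_feat, style_feat: (C, H, W) feature maps from a shared
    encoder (e.g. a VGG layer). Shapes and encoder choice are assumptions.
    """
    c, h, w = content_feat.shape
    content_flat = content_feat.reshape(c, -1)            # (C, N)
    style_sorted = style_feat.reshape(c, -1).sort(dim=1).values  # ascending per channel

    # If the style map has a different spatial size, resample its sorted
    # values so both tensors have N entries per channel.
    n = content_flat.shape[1]
    if style_sorted.shape[1] != n:
        idx = torch.linspace(0, style_sorted.shape[1] - 1, n).round().long()
        style_sorted = style_sorted[:, idx]

    # Rank of each content activation within its channel.
    ranks = content_flat.argsort(dim=1).argsort(dim=1)    # (C, N)

    # Place the k-th smallest style value where the k-th smallest content
    # value sits, preserving the content's spatial structure.
    out = torch.gather(style_sorted, 1, ranks)
    return out.reshape(c, h, w)


if __name__ == "__main__":
    content = torch.randn(64, 32, 32)
    style = torch.randn(64, 48, 48)
    stylized = rank_based_style_projection(content, style)
    print(stylized.shape)  # torch.Size([64, 32, 32])
```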
Year of Publication
2022

Date Published
May

Publisher
IEEE

Conference Location
Singapore, Singapore

ISBN Number
978-1-66540-540-9

URL
https://ieeexplore.ieee.org/document/9746290/

DOI
10.1109/ICASSP43922.2022.9746290