Neural Style Transfer - With the emergence of deep perceptual image features, style transfer has become a popular application that repaints a picture with the geometric patterns and textures of a sample style image. Our work combines perceptual features from multiple style images taken at different scales, e.g. to mix the large-scale structures of one style image with fine-scale textures. Surprisingly, this turns out to be difficult: most deep neural representations are learned to be robust to scale modifications, so large structures tend to be entangled with smaller scales. Here a multi-scale convolutional architecture is proposed for bi-scale style transfer. Our solution is based on a modular auto-encoder composed of two lightweight modules that are trained independently to transfer style at specific scales, with control over styles and colors.
Authored by Thibault Durand, Julien Rabin, David Tschumperle
Neural Style Transfer - As the economy develops, product piracy is becoming an increasingly serious problem. To protect brand copyrights, this paper proposes an automatic generation algorithm, based on neural style transfer, for anti-counterfeiting logos with security shading, which increases the difficulty of illegal copying and packaging production. The VGG19 deep neural network is used to extract image features and to compute the content response loss and style response loss. Building on the original neural style transfer algorithm, a content loss is added, and the generated security shading is fused with the original binary logo image to produce an anti-counterfeiting logo image with a higher recognition rate. In this paper, the global loss function is composed of the content loss, content response loss, and style response loss. The L-BFGS optimization algorithm iteratively reduces the global loss function, and the relationships among the relative weights of the three losses, the number of iterations, and the generated anti-counterfeiting logo are studied. Keeping the shading style image secret strengthens the algorithm's resistance to attack. The experimental results show that, compared with the original logo, this method generates distinguishable logo content and complex security shading, converges reliably, and withstands attacks.
Authored by Zhenjie Bao, Chaoyang Liu, Jinqi Chen, Jinwei Su, Yujiao Cao
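The loss construction described above follows the classic Gatys-style formulation: content terms compare feature maps directly, while style terms compare Gram matrices of the features. A minimal numpy sketch, with random arrays standing in for VGG19 activations and illustrative weights rather than the paper's actual settings, looks like:

```python
import numpy as np

def gram_matrix(feats):
    """Gram matrix of a (C, H, W) feature map: channel-wise correlations
    that capture style statistics independently of spatial layout."""
    c, h, w = feats.shape
    f = feats.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def content_loss(gen, target):
    """Mean squared error between generated and content feature maps."""
    return float(np.mean((gen - target) ** 2))

def style_loss(gen, style):
    """Mean squared error between Gram matrices of the feature maps."""
    return float(np.mean((gram_matrix(gen) - gram_matrix(style)) ** 2))

def global_loss(gen, content, style, alpha=1.0, beta=1e3):
    """Weighted sum that the optimizer (L-BFGS in the paper) minimizes."""
    return alpha * content_loss(gen, content) + beta * style_loss(gen, style)

# Random stand-ins for VGG19 activations of the three images.
rng = np.random.default_rng(0)
c_feats = rng.standard_normal((64, 16, 16))
s_feats = rng.standard_normal((64, 16, 16))
g_feats = c_feats.copy()  # initialize the generated image from the content

print(global_loss(g_feats, c_feats, s_feats))  # content term is zero here
```

In the full algorithm the generated image's pixels are the optimization variables, and these losses are summed over several VGG19 layers; the sketch shows a single layer only.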
Neural Style Transfer - As a form of computer art creation, style transfer has become more and more popular. To obtain good visual effects, many neural style transfer algorithms use a semantic map to guide the transfer of style between the correct regions. As an important means of ensuring the quality of style transfer, a semantic map can meaningfully control the results. However, generating semantic maps manually is cumbersome and inefficient. In this paper, we introduce a semantic segmentation network to automatically generate the semantic map required by neural style transfer and, combining it with a neural style transfer network, we propose a new neural style transfer algorithm. Experiments show that our algorithm not only avoids cumbersome manual work but also generates high-quality style transfer results.
Authored by ChangMing Wu, Min Yao
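The abstract does not spell out how the generated semantic map enters the transfer. A common way a semantic map guides style transfer between matching regions (an assumption here, not necessarily this paper's exact formulation) is to restrict the Gram-matrix style statistics to each labeled region, so one region's style never leaks into another:

```python
import numpy as np

def masked_gram(feats, mask):
    """Gram matrix over only the pixels where mask is nonzero, so the
    style statistics of one semantic region stay separate from others."""
    c = feats.shape[0]
    f = feats.reshape(c, -1)[:, mask.reshape(-1).astype(bool)]
    n = max(f.shape[1], 1)  # guard against empty regions
    return f @ f.T / (c * n)

def region_style_loss(gen, style, gen_mask, style_mask):
    """Style loss between matching semantic regions of the two images."""
    g = masked_gram(gen, gen_mask)
    s = masked_gram(style, style_mask)
    return float(np.mean((g - s) ** 2))

rng = np.random.default_rng(1)
gen = rng.standard_normal((8, 4, 4))
style = rng.standard_normal((8, 4, 4))
# Toy semantic maps: the top half of each image carries the same label.
gen_mask = np.zeros((4, 4)); gen_mask[:2] = 1
style_mask = np.zeros((4, 4)); style_mask[:2] = 1
print(region_style_loss(gen, style, gen_mask, style_mask))
```

In practice this loss would be summed over every label that appears in both semantic maps.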
Neural Style Transfer - Style transfer is an optimization technique that aims to blend the style of one input image into a content image. Deep neural networks have previously surpassed humans in tasks such as object identification and detection. By contrast, deep neural networks had lagged behind in generating high-quality creative products until recently. This article introduces deep-learning techniques that are vital to achieving such human characteristics and open up a new world of prospects. The system employs a pre-trained CNN so that the style of the provided image is transferred to the content image to generate a high-quality stylized image. The designed system's effectiveness is evaluated with the Mean Square Error (MSE), Peak Signal-to-Noise Ratio (PSNR), and Structural Similarity Index (SSIM) metrics; the designed method is observed to effectively maintain the structural and textural information of the cover image.
Authored by Kishor Bhangale, Pranoti Desai, Saloni Banne, Utkarsh Rajput
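The three evaluation metrics named above are standard and straightforward to compute with numpy. Note that the SSIM below uses a single global window as a simplification; the standard metric averages over local (typically 11x11) windows:

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two images."""
    return float(np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2))

def psnr(a, b, max_val=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * np.log10(max_val ** 2 / m)

def ssim_global(a, b, max_val=255.0):
    """Structural similarity over one global window (a simplification;
    the standard SSIM averages local windows across the image)."""
    a = a.astype(np.float64); b = b.astype(np.float64)
    c1, c2 = (0.01 * max_val) ** 2, (0.03 * max_val) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    va, vb = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / \
           ((mu_a ** 2 + mu_b ** 2 + c1) * (va + vb + c2))

# Toy comparison: a flat image versus a uniformly brightened copy.
x = np.full((8, 8), 128.0)
y = x + 10.0
print(mse(x, y), round(psnr(x, y), 2))
```

Higher PSNR and SSIM (closer to 1) indicate a stylized image that better preserves the reference's structure.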
Neural Style Transfer - Arbitrary image style transfer is a challenging task that aims to stylize a content image conditioned on arbitrary style images. In this task, feature-level content-style transformation plays a vital role in properly fusing content and style features. Existing feature transformation algorithms often suffer from loss of content or style details, non-natural stroke patterns, and unstable training. To mitigate these issues, this paper proposes a new feature-level style transformation technique, named Style Projection, for parameter-free, fast, and effective content-style transformation. This paper further presents a real-time feed-forward model that leverages Style Projection for arbitrary image style transfer and includes a regularization term for matching the semantics between input contents and stylized outputs. Extensive qualitative analysis, quantitative evaluation, and a user study demonstrate the effectiveness and efficiency of the proposed methods.
Authored by Siyu Huang, Haoyi Xiong, Tianyang Wang, Bihan Wen, Qingzhong Wang, Zeyu Chen, Jun Huan, Dejing Dou
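The abstract leaves the internals of Style Projection to the paper. For context, a widely used parameter-free feature-level content-style transformation is AdaIN (Huang & Belongie); the numpy sketch below shows that baseline as an illustration of the transformation family, not this paper's method:

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Adaptive Instance Normalization: re-scale each content channel to
    the style channel's mean and standard deviation. Parameter-free, so
    it handles arbitrary styles without retraining."""
    c_mu = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    s_mu = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    return s_std * (content - c_mu) / (c_std + eps) + s_mu

rng = np.random.default_rng(2)
content = rng.standard_normal((32, 8, 8))
style = 3.0 * rng.standard_normal((32, 8, 8)) + 5.0
out = adain(content, style)
# Per-channel statistics of the output now match the style features.
print(np.allclose(out.mean(axis=(1, 2)), style.mean(axis=(1, 2)), atol=1e-6))
```

In a full pipeline this transformation sits between a pre-trained encoder and a learned decoder that maps the transformed features back to pixels.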
Neural Style Transfer - Deep learning has shown promising results in several computer vision applications, such as style transfer. Style transfer aims at generating a new image by combining the content of one image with the style and color palette of another image. When applying style transfer to a 4D Light Field (LF) that represents the same scene from different angular perspectives, new challenges and requirements arise. While the visually appealing quality of the stylized image is an important criterion in 2D images, cross-view consistency is essential in 4D LFs. Moreover, the need for large datasets to train new robust models poses another challenge, given the limited LF datasets currently available. In this paper, a neural style transfer approach is used, along with a robust propagation based on over-segmentation, to stylize 4D LFs. Experimental results show that the proposed solution outperforms the state-of-the-art without any need for training new models or fine-tuning existing ones, while maintaining consistency across LF views.
Authored by Maryam Hamad, Caroline Conti, Paulo Nunes, Luis Soares
Neural Style Transfer - Text style transfer is a relevant task, contributing to theoretical and practical advancement in several areas, especially when working with non-parallel data. The concept behind non-parallel style transfer is to change a specific dimension of the sentence while retaining the overall context. Previous work used adversarial learning to perform this task; although not initially created to work with textual data, it proved very effective. Most previous work has focused on developing algorithms capable of transferring between binary styles, with limited generalization capabilities and limited applications. This work proposes a framework capable of working with multiple styles and improving content retention (BLEU) after a transfer. The proposed framework combines supervised learning of latent spaces with their separation within the architecture. The results suggest that the proposed framework improves content retention in multi-style scenarios while maintaining accuracy comparable to the state of the art.
Authored by Lorenzo Vecchi, Eliane Maffezzolli, Emerson Paraiso
Neural Style Transfer - Reducing inter-subject variability between new users and the measured source subjects, and effectively using the information of classification models trained on source-subject data, are very important for human–machine interfaces. In this study, we propose a style transfer mapping (STM) and fine-tuning (FT) subject transfer framework using convolutional neural networks (CNNs). To evaluate its performance, we used two public surface electromyogram datasets, MyoDatasets and NinaPro database 5. Our proposed framework, STM-FT-CNN, showed the best performance in all cases compared with conventional subject transfer frameworks. In the future, we will build an online processing system that includes this subject transfer framework and verify its performance in online experiments.
Authored by Suguru Kanoga, Takayuki Hoshino, Mitsunori Tada
Neural Style Transfer - Image style transfer is an important research topic in image processing and computer vision. Compared with traditional hand-crafted computing methods, deep-learning-based convolutional neural networks have powerful advantages: this newer approach offers high computational efficiency and a good style transfer effect. To further improve the quality and efficiency of image style transfer, the pre-trained VGG-16 and VGG-19 neural network models are each used to perform image style transfer, and the transferred images generated by the two networks are compared. The results show that image style transfer with the VGG-16 convolutional neural network is both better and more efficient.
Authored by Yilin Tao