Abstract
Neural style transfer has advanced rapidly in recent years, with substantial gains in both quality and efficiency. However, most existing methods fail to faithfully transfer the brushstroke characteristics of style images. In this article, we address this issue by training a multi-granularity brushstroke network based on a parallel encoding structure. Specifically, we first adopt a content parsing module to obtain the spatial distribution of the content image and the smoothness of its different regions. Then, brushstroke features of different granularities are transferred by a multi-granularity style-swap module guided by the region content map. Finally, the stylized features of the two branches are fused to enhance the stylized results. The multi-granularity brushstroke network is jointly supervised by a new multi-layer brushstroke loss and pre-existing losses. The proposed method closely mirrors the artistic drawing process. In addition, we can control whether the color of the stylized results tends toward the style image or the content image. Experimental results demonstrate the advantage of our proposed method compared with existing schemes.
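As background, the style-swap operation that a module like the one described above builds on replaces each patch of the content feature map with its most similar style-feature patch. The NumPy sketch below illustrates a single-granularity version of that operation; the function name, the fixed patch size, and the cosine-similarity matching are illustrative assumptions, not the paper's exact multi-granularity implementation.

```python
import numpy as np

def style_swap(content_feat, style_feat, patch=3, stride=1):
    """Replace each content-feature patch with its best-matching
    style-feature patch (cosine similarity). Features are (C, H, W)."""
    C, H, W = content_feat.shape
    # Extract all style patches and L2-normalize them for cosine matching.
    sp = []
    for i in range(0, style_feat.shape[1] - patch + 1, stride):
        for j in range(0, style_feat.shape[2] - patch + 1, stride):
            sp.append(style_feat[:, i:i + patch, j:j + patch].ravel())
    sp = np.stack(sp)                          # (N, C * patch * patch)
    sp_norm = sp / (np.linalg.norm(sp, axis=1, keepdims=True) + 1e-8)

    out = np.zeros_like(content_feat)
    count = np.zeros((H, W))
    for i in range(0, H - patch + 1, stride):
        for j in range(0, W - patch + 1, stride):
            cp = content_feat[:, i:i + patch, j:j + patch].ravel()
            k = np.argmax(sp_norm @ cp)        # index of best style patch
            out[:, i:i + patch, j:j + patch] += sp[k].reshape(C, patch, patch)
            count[i:i + patch, j:j + patch] += 1
    # Average contributions where swapped patches overlap.
    return out / np.maximum(count, 1)
```

In a multi-granularity setting, this matching would be repeated at several patch sizes and the choice of granularity per region would be driven by the region content map described above.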
Index Terms
Multi-granularity Brushstrokes Network for Universal Style Transfer