Photographic style transfer

Authors: Wang, L., Wang, Z., Yang, X., Hu, S.-M. and Zhang, J.

Journal: The Visual Computer

Volume: 36

Issue: 2

Pages: 317-331

ISSN: 0178-2789

eISSN: 1432-2315

DOI: 10.1007/s00371-018-1609-4

Abstract:

Image style transfer has attracted much attention in recent years. However, the results produced by existing methods still contain many distortions. This paper investigates CNN-based artistic style transfer specifically and identifies two key sources of distortion: the loss of the content image's spatial structure during the content-preserving process, and the unexpected geometric matching introduced by the style transformation process. To tackle this problem, the paper proposes a novel approach consisting of a dual-stream deep convolutional network as the loss network and edge-preserving filters as the style fusion model. Our key contribution is the introduction of an additional similarity loss function that constrains both the detail reconstruction and style transfer procedures. The qualitative evaluation shows that our approach successfully suppresses the distortions and obtains faithful stylized results compared with state-of-the-art methods.
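
A minimal sketch of the kind of objective the abstract describes: a content term, a Gram-matrix style term, and an extra similarity term that penalises structural drift. This is an illustration under assumptions, not the authors' loss network; the edge-based similarity term, the feature shapes, and the weights alpha, beta and gamma are all hypothetical stand-ins.

import numpy as np

def gram_matrix(feat):
    # feat: (C, H, W) feature map -> (C, C) channel correlation matrix
    c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def content_loss(feat, content_feat):
    # mean squared error between feature maps (content preservation)
    return np.mean((feat - content_feat) ** 2)

def style_loss(feat, style_feat):
    # match second-order feature statistics via Gram matrices (style)
    return np.mean((gram_matrix(feat) - gram_matrix(style_feat)) ** 2)

def edge_map(img):
    # crude gradient-magnitude edges; a stand-in for an edge-preserving filter
    gy, gx = np.gradient(img.mean(axis=0))  # img: (3, H, W)
    return np.hypot(gx, gy)

def similarity_loss(img, content_img):
    # hypothetical similarity term: keep the edge structure of the content image
    return np.mean((edge_map(img) - edge_map(content_img)) ** 2)

def total_loss(feat, content_feat, style_feat, img, content_img,
               alpha=1.0, beta=1e3, gamma=1e2):
    # weighted sum; the weights are illustrative, not taken from the paper
    return (alpha * content_loss(feat, content_feat)
            + beta * style_loss(feat, style_feat)
            + gamma * similarity_loss(img, content_img))

# toy usage with random data, just to show the pieces compose
rng = np.random.default_rng(0)
feat = rng.standard_normal((64, 32, 32))
img = rng.random((3, 128, 128))
print(total_loss(feat, rng.standard_normal((64, 32, 32)),
                 rng.standard_normal((64, 32, 32)),
                 img, rng.random((3, 128, 128))))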

https://eprints.bournemouth.ac.uk/31496/

Sources: Scopus, Web of Science (Lite), BURO EPrints
