
Style transfer is currently in Beta.

Style Transfer

The Task

The Neural Style Transfer Task is an optimization technique that extracts the style from one image and applies it to another image while preserving that image's content. The goal is for the output image to look like the content image, but “painted” in the style of the style reference image.

[Figure: style transfer example]

The flash.image.style_transfer.model.StyleTransfer and flash.image.style_transfer.data.StyleTransferData classes rely internally on pystiche.


Example

Let's look at transferring the style from The Starry Night onto the images from the COCO 128 data set used in the object_detection guide. Once we've downloaded the data using flash.core.data.download_data, we create the flash.image.style_transfer.data.StyleTransferData data module. Next, we create our flash.image.style_transfer.model.StyleTransfer task with the desired style image and fit it on the COCO 128 images. We then use the trained flash.image.style_transfer.model.StyleTransfer for inference. Finally, we save the model. Here's the full example:

../../../flash_examples/style_transfer.py
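
The bundled example file is not reproduced here. As a rough sketch of the steps above, the code looks something like the following; the download URL, folder paths, style image path, and trainer settings are illustrative assumptions, so refer to flash_examples/style_transfer.py for the exact code.

    import torch

    import flash
    from flash.core.data.utils import download_data
    from flash.image.style_transfer import StyleTransfer, StyleTransferData

    # 1. Download the COCO 128 images (URL is an assumption; see the object detection guide).
    download_data("https://github.com/zhiqwang/yolov5-rt-stack/releases/download/v0.3.0/coco128.zip", "data/")

    # 2. Create the DataModule from the folder of content images.
    datamodule = StyleTransferData.from_folders(
        train_folder="data/coco128/images/train2017",
        batch_size=4,
    )

    # 3. Build the task with the desired style image (path is an assumption).
    model = StyleTransfer("starry_night.jpg")

    # 4. Create the trainer and fit on the COCO 128 images.
    trainer = flash.Trainer(max_epochs=3, gpus=torch.cuda.device_count())
    trainer.fit(model, datamodule=datamodule)

    # 5. Use the trained model for inference on a few images.
    predict_datamodule = StyleTransferData.from_files(
        predict_files=["data/coco128/images/train2017/000000000625.jpg"],
        batch_size=1,
    )
    predictions = trainer.predict(model, datamodule=predict_datamodule)
    print(predictions)

    # 6. Save the model.
    trainer.save_checkpoint("style_transfer_model.pt")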

To learn how to view the available backbones and heads for this task, see backbones_heads.


Flash Zero

The style transfer task can be used directly from the command line with zero code using flash_zero. You can run the above example with:

flash style_transfer

To view configuration options and options for running the style transfer task with your own data, use:

flash style_transfer --help
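
As a rough illustration only (the exact subcommand and argument names below are assumptions; confirm them against the --help output), pointing the task at your own folder of content images might look like:

flash style_transfer from_folders --train_folder data/my_images --trainer.max_epochs 3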