Style transfer is currently in Beta.
The Neural Style Transfer Task is an optimization technique that extracts the style from one image and applies it to another image while preserving that image's content. The goal is for the output image to look like the content image, but “painted” in the style of the style reference image.
The ~flash.image.style_transfer.model.StyleTransfer and ~flash.image.style_transfer.data.StyleTransferData classes internally rely on pystiche.
Let's look at transferring the style from The Starry Night onto the images from the COCO 128 dataset from the object_detection guide. Once we've downloaded the data using ~flash.core.data.download_data, we create the ~flash.image.style_transfer.data.StyleTransferData. Next, we create our ~flash.image.style_transfer.model.StyleTransfer task with the desired style image and fit it on the COCO 128 images. We then use the trained ~flash.image.style_transfer.model.StyleTransfer for inference. Finally, we save the model. Here's the full example:
../../../flash_examples/style_transfer.py
To learn how to view the available backbones / heads for this task, see backbones_heads.
The style transfer task can be used directly from the command line with zero code using flash_zero. You can run the above example with:
flash style_transfer
To view configuration options and options for running the style transfer task with your own data, use:
flash style_transfer --help