
Style Transfer

This notebook implements the style transfer technique from "Image Style Transfer Using Convolutional Neural Networks" (Gatys et al., CVPR 2016).

The general idea is to take two images and produce a new image that reflects the content of one and the artistic "style" of the other. This is done by formulating a loss function that matches the content and style of the respective source images in the feature space of a deep network, and then performing gradient descent on the pixels of the generated image itself.
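The loss is a weighted sum of a content term, which compares raw feature maps, and a style term, which compares Gram matrices of feature maps. A minimal sketch of that formulation in PyTorch (the function names, layer keys, and weights here are illustrative, not the notebook's exact choices):

```python
import torch.nn.functional as F

def gram_matrix(feat):
    # feat: (1, C, H, W) feature map. The (C, C) Gram matrix of channel
    # correlations is what "style" means in this formulation.
    _, c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    return f @ f.t() / (c * h * w)

def total_loss(gen_feats, content_feats, style_feats,
               content_layers, style_layers,
               content_weight=1.0, style_weight=1e6):
    # Each *_feats argument is a dict mapping layer ids to feature maps
    # extracted from the generated, content, and style images respectively.
    loss = sum(content_weight * F.mse_loss(gen_feats[l], content_feats[l])
               for l in content_layers)
    loss = loss + sum(style_weight * F.mse_loss(gram_matrix(gen_feats[l]),
                                                gram_matrix(style_feats[l]))
                      for l in style_layers)
    return loss
```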

The deep network used as a feature extractor is SqueezeNet, a small model pretrained on ImageNet. Any pretrained CNN could serve this role, but SqueezeNet was chosen here for its small size and efficiency.
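Extracting intermediate activations might look like the following sketch, which treats SqueezeNet's `features` module as a frozen feature extractor; the layer indices chosen are illustrative:

```python
import torchvision

# Load SqueezeNet pretrained on ImageNet and keep only its conv backbone.
model = torchvision.models.squeezenet1_1(weights="DEFAULT").features
model.eval()
for p in model.parameters():
    p.requires_grad_(False)  # network weights stay fixed; only pixels change

def extract_features(x, layers=(1, 4, 6, 8)):
    # Run x through the backbone, recording activations at chosen indices.
    feats = {}
    for i, layer in enumerate(model):
        x = layer(x)
        if i in layers:
            feats[i] = x
    return feats
```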

The implementation is in PyTorch, and the notebook was run on Google Colab.
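Because the optimization is over image pixels rather than network weights, the update loop treats the image tensor itself as the parameter. A sketch, assuming `extract_features` and `total_loss` from above, plus preprocessed `(1, 3, H, W)` tensors `content_img` and `style_img` (all hypothetical names):

```python
import torch

# Start from the content image and optimize its pixels directly.
img = content_img.clone().requires_grad_(True)
optimizer = torch.optim.Adam([img], lr=0.05)

# Target features are computed once; the network itself never changes.
content_feats = extract_features(content_img)
style_feats = extract_features(style_img)

for step in range(200):
    optimizer.zero_grad()
    gen_feats = extract_features(img)
    loss = total_loss(gen_feats, content_feats, style_feats,
                      content_layers=(6,), style_layers=(1, 4, 6, 8))
    loss.backward()
    optimizer.step()  # gradient step on the pixels of img
```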

Example image produced by this method:

[stylized output image]
