
shivamshan/Neural-Style-Transfer


Neural-Style-Transfer

Neural style transfer is an optimization technique used to take two images—a content image and a style reference image (such as an artwork by a famous painter)—and blend them together so the output image looks like the content image, but “painted” in the style of the style reference image.

This style transfer implementation uses specific layers of the VGG-19 model to extract high-level features. Those features are compared against the features of a randomly generated image, which is iteratively optimized until it combines the content of the content image with the style of the style image.
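In the standard formulation (Gatys et al.), content is compared through the raw VGG-19 activations, while style is compared through Gram matrices of those activations. A minimal NumPy sketch of the Gram matrix, with a hypothetical `gram_matrix` helper (not necessarily this repo's function names):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of one feature map.

    features: array of shape (H, W, C) -- one VGG-19 activation map.
    Returns a (C, C) matrix of channel-wise correlations, which
    captures texture/style while discarding spatial layout.
    """
    h, w, c = features.shape
    flat = features.reshape(h * w, c)   # each row: one spatial position
    return flat.T @ flat / (h * w)      # normalize by number of positions
```

The style loss is then the mean squared difference between the Gram matrices of the style image and the generated image at the chosen layers.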

Initial Randomly Generated Image


Content Image

Style Image

Intermediate Images

These images show how the initial random image gradually develops into the final image over the course of optimization
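The intermediate images come from repeatedly updating the pixels by gradient descent on the combined loss. A toy sketch of that loop, using a simple quadratic loss as a stand-in (in the real implementation the gradient comes from backpropagating the content + style loss through VGG-19):

```python
import numpy as np

def stylize(image, target, steps=100, lr=0.1):
    """Toy gradient-descent loop: pull `image` toward `target`.

    Stand-in for the real loop, where each step nudges the
    generated image's pixels to reduce the content + style loss.
    """
    img = image.copy()
    for _ in range(steps):
        grad = 2 * (img - target)   # d/d_img of ||img - target||^2
        img -= lr * grad            # gradient-descent pixel update
    return img
```

Saving `img` every few steps produces exactly this kind of intermediate-image sequence.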

Final image

Comparison

The relative strength of content and style in the output can be adjusted through the alpha and beta loss-weighting parameters.
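In the usual formulation the objective is a weighted sum, L = alpha * L_content + beta * L_style. A minimal sketch of that weighting (the default alpha/beta values and names here are illustrative, not necessarily the repo's):

```python
import numpy as np

def mse(a, b):
    return np.mean((a - b) ** 2)

def total_loss(gen_feats, content_feats, gen_gram, style_gram,
               alpha=1.0, beta=100.0):
    """Weighted style-transfer objective.

    Larger alpha preserves more of the content image;
    larger beta pushes the result further toward the style image.
    """
    content_loss = mse(gen_feats, content_feats)
    style_loss = mse(gen_gram, style_gram)
    return alpha * content_loss + beta * style_loss
```

Raising beta relative to alpha makes the output more painterly; raising alpha keeps it closer to the original photograph.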
