Two Minute Papers goes over the paper “#GANILLA: Generative Adversarial Networks for Image to Illustration Translation” in this video.
Ben Marriott uses the style-transfer software EbSynth on animated footage to see if you really can change the style using a single reference frame.
EbSynth is technically not artificial intelligence software; it uses example-based style transfer to get its results. I put AI in the title and thumbnail because it does what most people would think of when we talk about this type of AI in video… also for views.
You can get EbSynth here: https://ebsynth.com/
Two Minute Papers covers the technique presented in the paper “Stylizing Video by Example” in this video.
Two Minute Papers explores the paper “Fast Example-Based Stylization with Local Guidance” in this video.
Here’s a great tutorial that uses deep learning to compose one image in the style of another image. If you’ve ever wished that you could paint like Picasso or Van Gogh, then this AI technique is your big chance.
The technique, known as neural style transfer, is outlined in the paper “A Neural Algorithm of Artistic Style,” and you can try it today with TensorFlow.
Neural style transfer is an optimization technique used to take two images—a content image and a style reference image (such as an artwork by a famous painter)—and blend them together so the output image looks like the content image, but “painted” in the style of the style reference image.
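To make the idea concrete, here is a minimal NumPy sketch of the loss that neural style transfer optimizes: content similarity is measured by comparing feature maps directly, while style similarity is measured by comparing Gram matrices (channel correlations) of those feature maps. This is a simplified illustration; a real implementation (like the TensorFlow tutorial) extracts features from several layers of a pretrained network such as VGG and minimizes this loss by gradient descent on the output image. The function names and weights here are illustrative, not from the tutorial.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of an (height, width, channels) feature map:
    channel-by-channel correlations that capture texture, i.e. 'style'."""
    h, w, c = features.shape
    flat = features.reshape(h * w, c)
    return flat.T @ flat / (h * w)

def style_transfer_loss(content_feat, style_feat, generated_feat,
                        content_weight=1e4, style_weight=1e-2):
    """Weighted sum of content loss (mean squared error between feature
    maps) and style loss (mean squared error between Gram matrices)."""
    content_loss = np.mean((generated_feat - content_feat) ** 2)
    style_loss = np.mean(
        (gram_matrix(generated_feat) - gram_matrix(style_feat)) ** 2
    )
    return content_weight * content_loss + style_weight * style_loss
```

Optimizing the generated image so this loss shrinks pulls its features toward the content image and its textures toward the style image at the same time, which is exactly the blend the paragraph above describes.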