DEV Community

Farai Gandiya

Originally published at codelab.farai.xyz

Converting Video into Animated Images Using FFmpeg

Converting Video into Animated Images Using FFmpeg was first published on Farai's Codelab.


This post has been republished with new information.


This post will supplement an upcoming post on why you should stop using GIFs in favor of newer image and video formats. Here’s how to generate animated images in various media formats using FFmpeg.

Preparing The Source Clip For Further Conversion

The first thing to do is convert the source clip to an uncompressed y4m video, slicing it up and setting the framerate as necessary:

ffmpeg -ss <start_point> -t <duration_from_start> -i <source_media> -an -vf 'scale=<width>:<height>,setpts=<stretch_factor>*PTS,fps=<framerate>' -pix_fmt yuv420p <raw_input>.y4m


What the option flags mean:

Option Description
-ss Marks the start position of the video stream as a time duration
-t Specifies the duration of the stream from -ss (or 0) as a time duration. Use both -ss and -t before -i to limit the video input.
-an Removes audio
-vf The video filters
scale Sets the width and height of the video
setpts Sets the presentation timestamps (PTS) for the video. Used to speed video up or slow it down
fps Specifies the framerate for the video
-pix_fmt Sets the pixel format. Necessary if you’re converting from GIFs.
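As a concrete (hypothetical) example, here’s the template filled in to take a 4-second clip starting at 0:02 from an input.mp4, scaled to 480×270 at 20fps with no speed change. The filter string is built in a variable just to show how the pieces combine:

```shell
# Hypothetical values -- substitute your own source file and dimensions.
width=480; height=270; stretch=1.0; rate=20
vf="scale=${width}:${height},setpts=${stretch}*PTS,fps=${rate}"
echo "$vf"    # scale=480:270,setpts=1.0*PTS,fps=20

# Requires ffmpeg and an input.mp4 in the working directory:
# ffmpeg -ss 00:00:02 -t 4 -i input.mp4 -an -vf "$vf" -pix_fmt yuv420p raw_input.y4m
```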

Note: if you’re setting a framerate for a GIF source, GIF stores the delay between frames in hundredths of a second, so the framerate works out to fps = 100/delay. So:

Delay Framerate
1 100fps
2 50fps
3 ~33fps
4 25fps
5 20fps
6 ~17fps
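The table above follows straight from fps = 100/delay, which you can sanity-check with shell arithmetic (integer division, so delays of 3 and 6 truncate to 33fps and 16fps):

```shell
# GIF frame delays are in hundredths of a second, so fps = 100 / delay.
for delay in 1 2 4 5; do
  echo "delay=$delay -> $((100 / delay))fps"
done
# delay=1 -> 100fps
# delay=2 -> 50fps
# delay=4 -> 25fps
# delay=5 -> 20fps
```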

Images

GIF

ffmpeg -i <raw_input>.y4m -filter_complex "[0:v] split [a][b];[a] palettegen [p];[b][p] paletteuse" -loop 0 <output>.gif


Simply put, the -filter_complex flag here generates a color palette and then uses it for the GIF. See GIPHY’s engineering blog post on how to make GIFs for a full explanation of the filter.

-loop 0 makes it loop forever.

You can leave the palette filter out, resulting in a much smaller GIF (at the expense of quality).

WebP

ffmpeg -i <raw_input>.y4m -loop 0 -q:v 100 -compression_level 6 <output>.webp


-q:v sets the image quality, from 0 to 100. The result will be much smaller than a GIF, but WebP can get blocky if you push the quality down too hard.

Like GIF, -loop 0 makes it loop forever.

Sequenced AVIF (Chrome For Now)

ffmpeg didn’t support writing to AVIF containers when I wrote this, so you need to use avifenc instead.

avifenc <raw_input>.y4m <output>.avif


Videos

From here on out, I won’t go over the options in much detail since there’s a lot to consider, most of which I don’t understand, so I’ll link to the respective encoding guide if you want more options.

AVC/h.264 MP4 (widest support)

ffmpeg -i <raw_input>.y4m -c:v libx264 -preset veryslow <output>.mp4


The -preset flag trades encoding time for compression efficiency: slower presets produce smaller files at the same quality.

H.264 Encoding Guide

HEVC/h.265 (WebKit)

ffmpeg -i <raw_input>.y4m -c:v libx265 -preset veryslow -tag:v hvc1 <output>.mp4


You need the -tag:v hvc1 flag for the video to play in Safari. Thanks, Aaron!

-preset works the same as it does for AVC.

H.265 Encoding Guide

VP8/VP9 (Not WebKit)

Use libvpx for VP8 or libvpx-vp9 for VP9, depending on which one you choose. VP9 is newer and compresses better, although it isn’t as widely supported as VP8 (support is still reasonable).

ffmpeg -i <raw_input>.y4m -c:v libvpx-vp9 <output>.webm


VP8 Encoding Guide/VP9 Encoding Guide

AV1 Video (Chrome and Firefox)

Update 21 May 2021: When I initially wrote this post, my build of ffmpeg didn’t include the different AV1 encoders, so I tried to encode AV1 videos using the encoders directly. As I was updating this post to add instructions on how to use them, I noticed that the alternate encoders are now built into the standard build of ffmpeg.

Encoding AV1 uses a lot of CPU and takes longer than the other codecs.

ffmpeg -i <raw_input>.y4m -c:v libaom-av1 <output>.webm


If you’re on an older version of ffmpeg, you need to add -strict -2. Note that you can also use rav1e and SVT-AV1 in FFmpeg to encode AV1 videos.
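For reference, those alternative encoders ship under the encoder names librav1e and libsvtav1 (run `ffmpeg -encoders` to check your build includes them). The loop below only prints the equivalent commands, since the encodes themselves need ffmpeg and a source clip:

```shell
# Print the equivalent encode command for each bundled AV1 encoder.
for enc in libaom-av1 librav1e libsvtav1; do
  echo "ffmpeg -i raw_input.y4m -c:v $enc output-$enc.webm"
done
```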

AV1 Encoding Guide.

aomenc (Reference Encoder)

You’ll either have to compile aomenc yourself or you can use aomenc’s Windows builds by Marco Sousa.

aomenc.exe -o <output_file>.webm <raw_input>.y4m


rav1e (Fastest)

You can find rav1e’s builds on GitHub. Note that rav1e only generates the raw video stream (saved as an ivf file) which needs to be muxed into a webm container file.

rav1e.exe <raw_input>.y4m -o <intermediate>.ivf
ffmpeg -i <intermediate>.ivf -c:v copy <output>.webm -hide_banner -loglevel error


SVT-AV1

While there are some SVT-AV1 builds on GitHub, that project is marked as archived, and SVT-AV1’s GitLab repo doesn’t have any prebuilt binaries, so you’ll have to compile it yourself. However you get the binary, as with rav1e, you need to mux the raw video stream into a webm container file.

SvtAv1EncApp.exe -b <intermediate>.ivf -i <raw_input>.y4m
ffmpeg -i <intermediate>.ivf -c:v copy <output>.webm


Thanks for reading! If you liked this post, consider supporting my work.

You can also subscribe to my newsletter.

Have feedback? Send an email to gandiyafarai+feedback at gmail dot com
