#DailyBloggingChallenge (262/300)
While looking back at the blog post on how to create a #GIF via #Gimp (<https://bf5.eu/post/how-to-make-a-gif-with-gimp/>) with regard to making #beforeAndAfter animations for #OpenStreetMap, I noticed a couple of things that irk me and that I had previously ignored.
These things include animation quality and processing time.
#DailyBloggingChallenge (263/300)
The #Gimp guide still has its place when dealing with images that are slightly misaligned. This is because one will need to adjust the images in such a manner that they overlap on top of each other.
This practice can be extended to the point that once a set of standardized images exists, the misaligned ones can be filled and cropped to align with the standard.
#DailyBloggingChallenge (266/300)
The #Gimp guide suggests exporting the animation as a #GIF at 1 frame per second.
This approach is fine if one is dealing with images that use fewer than 256 colors. Otherwise one will notice some compression artefacts.
#DailyBloggingChallenge (267/300)
#Gimp also permits exporting the animation as a #WEBP. This nullifies the artefact issue introduced by #GIF.
The only issue is that if one wants to upload the result to the #Fediverse, the animation won't be registered.
#DailyBloggingChallenge (268/300)
It is not only #WEBP animations that halt in the #Fediverse; #GIF ones do too.
To overcome this issue, one can convert the file to an MP4.
For GIF one can run this via #FFMPEG:
```
ffmpeg -i input.gif output.mp4
```
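Depending on the source, this plain conversion can fail or produce an MP4 that some players refuse, since libx264 expects even frame dimensions and many players expect the yuv420p pixel format. A more defensive variant might look like this (filenames are placeholders):

```
# round width/height down to even numbers and force a widely supported pixel format
ffmpeg -i input.gif -vf "scale=trunc(iw/2)*2:trunc(ih/2)*2" -pix_fmt yuv420p output.mp4
```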
Sadly, currently the same cannot be done with WEBP, since [decoding animated WEBP](https://trac.ffmpeg.org/ticket/4907) has not yet been implemented.
#DailyBloggingChallenge (269/300)
One option to handle the #WEBP case is to follow this [guide](https://superuser.com/a/1688890). There are some modifications that should be made.
First, extract the frames using #ImageMagick:
```
magick input.webp frames.png
```
Then use #FFMPEG to build an MP4 video from the extracted frames:
```
ffmpeg -r 1 -i frames-%d.png -c:v libx264 output.mp4
```
#DailyBloggingChallenge (270/300)
The issue with this approach is that when extracting the frames via #ImageMagick one will only get the changed regions of each frame; the rest will be transparent (alpha channel).
Thus the MP4 built with #FFMPEG won't look as expected.
One could go back into #Gimp and merge each frame down to the base layer (the first frame), or just use the reference images.
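Alternatively, #ImageMagick can do the merging itself: the `-coalesce` operator composites each delta frame onto the full canvas before writing it out, so every extracted PNG should be a complete image. A sketch of the extraction step with coalescing (filenames are placeholders):

```
# fully composite each animation frame before extracting it
magick input.webp -coalesce frames.png
```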
#DailyBloggingChallenge (272/300)
One nice thing that #Gimp provides is the option to crop all layers at once. The same can be achieved with #FFMPEG by cropping the _output.mp4_.
Just follow this guide: [How can I crop a video with ffmpeg](https://video.stackexchange.com/a/4571)
#Gimp or any other image editor can be used to determine the crop area.
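As a rough sketch, the crop filter takes the output width, height, and the top-left offset of the region; the numbers below are placeholders to be replaced with the area read off in #Gimp:

```
# crop a 640x480 region whose top-left corner sits at (100,50)
ffmpeg -i output.mp4 -vf "crop=640:480:100:50" -c:v libx264 cropped.mp4
```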
Hypothetically one could combine step [265](https://qoto.org/@barefootstache/112319462127053043) and this one into a single command.
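Assuming the other step is the frame-to-MP4 build from above, such a combined invocation might look roughly like this (crop values are again placeholders):

```
# build the MP4 from the extracted frames and crop in the same pass
ffmpeg -r 1 -i frames-%d.png -vf "crop=640:480:100:50" -c:v libx264 output.mp4
```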