Blog Entry

Contributing to open source: FFmpeg and Blender


Background

I'm still a student at age 30. There was a gap between graduating with a Bachelor of Science in Computer Science back in 2014-ish and starting a master's program in Computer Science. The gap was me getting a job and quitting soon after due to health reasons. And then I moved to another country (but that's a story for another day). In between working on courses at my own pace, I dabbled in many different interests: sometimes attending game jams, sometimes making prototypes for games that never ended up getting finished. And sometimes contributing to an open source project (in this case, two).

Background for implementing a filter for FFmpeg

One of the last few courses I took in my undergrad years was a photography class (two of them, actually), and I discovered that pictures taken with a DSLR may not be "corrected". The lens you use introduces distortion in the resulting raw image, which some cameras correct for you; otherwise, you need to rely on software to correct the image. Take a fish-eye lens, for example. With such a wide field of view, things near the corners of the raw image look warped. Applying correction produces an image that is usually no longer rectangular (and may lose quality toward the edges), but can now be viewed as if there were no lens-based artifacts. You can see examples of this in more recent automobiles, where the side/rear-view camera feeds on the dashboard look "natural" only because lens correction has been applied.

So, with this lens correction problem in mind for photos I had taken with my DSLR, I wanted to correct images with FFmpeg (which I know as the swiss-army-knife of manipulating video/audio streams). I also use darktable, which is nice software that can apply lens correction as well. The thing is, I noticed that darktable uses the lensfun library for its lens correction, while FFmpeg only had a different correction filter that was hard to configure, since every camera and lens combination is different. Lensfun happens to maintain a database of cameras and lenses, which supplies the right parameters for correction software to apply to your image. And that was when I had the idea of implementing a lensfun filter for FFmpeg.

Creating the filter for FFmpeg

FFmpeg has a GitHub mirror, but development is not done on that code forge. FFmpeg uses a mailing list to submit and review patches for inclusion into FFmpeg's repository. I realized I had to get a setup working such that I could send patches directly with a git send-email ..., since git formats them in a way that makes review on the list easier. The second hurdle was that FFmpeg is primarily written in C. Well, not that bad of a hurdle for me, since at the time C++ was my language of choice. So I examined other filters in FFmpeg's sources to see how a filter could be created, and worked onwards from there. It's always hard to work on another person's/group's code-base as a newbie to it, so tools like cscope in vim helped tremendously in figuring out where functions/variables were defined and where they were used. When I got to a point where I felt familiar enough to write a filter, I went ahead and researched the API for lensfun.
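
For the curious, here's roughly the shape of the boilerplate I was studying. This is a heavily trimmed, from-memory sketch of what a libavfilter video filter looked like around the time I wrote mine, not the actual vf_lensfun.c, and newer FFmpeg versions have since reshuffled some of these macros and struct fields:

    #include "libavutil/opt.h"
    #include "avfilter.h"
    #include "internal.h"

    typedef struct LensfunContext {
        const AVClass *class;
        const char *make, *model, *lens_model;  /* filter options land here */
    } LensfunContext;

    #define OFFSET(x) offsetof(LensfunContext, x)
    #define FLAGS AV_OPT_FLAG_VIDEO_PARAM | AV_OPT_FLAG_FILTERING_PARAM
    static const AVOption lensfun_options[] = {
        { "make", "camera maker", OFFSET(make), AV_OPT_TYPE_STRING, { .str = NULL }, 0, 0, FLAGS },
        { NULL }
    };
    AVFILTER_DEFINE_CLASS(lensfun);

    static int filter_frame(AVFilterLink *inlink, AVFrame *in)
    {
        /* the real work (correcting `in`) happens here */
        return ff_filter_frame(inlink->dst->outputs[0], in);
    }

    static const AVFilterPad lensfun_inputs[] = {
        { .name = "default", .type = AVMEDIA_TYPE_VIDEO, .filter_frame = filter_frame },
        { NULL }
    };
    static const AVFilterPad lensfun_outputs[] = {
        { .name = "default", .type = AVMEDIA_TYPE_VIDEO },
        { NULL }
    };

    AVFilter ff_vf_lensfun = {
        .name        = "lensfun",
        .description = NULL_IF_CONFIG_SMALL("Apply lens correction via the lensfun library."),
        .priv_size   = sizeof(LensfunContext),
        .priv_class  = &lensfun_class,
        .inputs      = lensfun_inputs,
        .outputs     = lensfun_outputs,
    };

Once a filter like this is registered (plus an entry in libavfilter's Makefile and filter list), it becomes usable from the command line with something like "ffmpeg -i in.jpg -vf lensfun=... out.jpg".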

Lensfun

It's been a while since I wrote the filter for FFmpeg, but I do remember having to implement a Gaussian function (I think it was Gaussian) for some reason. Lensfun provided the parameters to apply lens correction, but it did not provide all of the needed functionality, which was why the Gaussian implementation was required. So I had to wrangle with both FFmpeg and lensfun. I did eventually figure it out well enough that I could actually apply lens correction to images taken with my DSLR.
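
To give a flavor of what lensfun left to me (as best I remember it): the library's modifier can fill a buffer with, for each destination pixel, the source coordinates it should be sampled from, and the actual resampling of pixels through that coordinate map is up to the caller. Something in the spirit of the sketch below, where compute_distortion_map() is a hypothetical stand-in for the lensfun call and I've used plain bilinear weighting instead of whatever kernel I actually shipped:

    #include <math.h>
    #include <stdint.h>

    /* Hypothetical stand-in for the lensfun call that fills `map` with
     * interleaved (x, y) source coordinates, one pair per destination pixel. */
    void compute_distortion_map(float *map, int width, int height);

    /* Resample one 8-bit grayscale plane through the coordinate map.
     * Bilinear weighting for brevity; a real filter would use a nicer kernel. */
    static void remap(const uint8_t *src, uint8_t *dst, int width, int height,
                      const float *map)
    {
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                const float *c = &map[2 * (y * width + x)];
                int x0 = (int)floorf(c[0]), y0 = (int)floorf(c[1]);
                float fx = c[0] - x0, fy = c[1] - y0;

                if (x0 < 0 || y0 < 0 || x0 + 1 >= width || y0 + 1 >= height) {
                    dst[y * width + x] = 0;  /* sample falls outside the source */
                    continue;
                }

                /* Weighted average of the four neighbouring source pixels. */
                float v = (1 - fx) * (1 - fy) * src[y0 * width + x0]
                        +       fx * (1 - fy) * src[y0 * width + x0 + 1]
                        + (1 - fx) *       fy * src[(y0 + 1) * width + x0]
                        +       fx *       fy * src[(y0 + 1) * width + x0 + 1];
                dst[y * width + x] = (uint8_t)(v + 0.5f);
            }
        }
    }

Swap the bilinear weights for a Gaussian (or Lanczos) kernel over a small window and you get the smoother interpolation the filter needed.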

Review for the Lensfun filter

I overlooked a couple of things when I submitted the first version of my patch implementing an FFmpeg filter that uses lensfun. There was a linter script and a formatter specification (either it was a script that formatted the code, or a config to pass to clang-format, I forget which). It took several revisions before the patch was finally approved. Unfortunately, lensfun is GPL 3, so only builds of FFmpeg with GPL 3 enabled can use the filter. But it was a nice accomplishment.

Takeaways from developing in FFmpeg

I always referred to the docs, API references, and whatever other documentation the project provided, if it existed. I would avoid asking questions unless absolutely necessary (yeah, I'm kind of shy). One of the most important things when working on someone else's code is that you have to familiarize yourself with the process for submitting code and the code formatting standards, and accept that your patch may not be up to snuff and will require more revisions.

Anyways, the commit that implemented the filter can be found here.

Blender is a video editor?

So, I would often turn to Blender for video editing, though I really ought to learn 3D modeling since that is a useful thing to know. I looked around and noticed that Blender leverages FFmpeg for rendering video. I then remembered that AV1 support was chugging along well, and I thought to myself, "what if I set up Blender to render to AV1 video?"

AV1, not AVI

AV1 is a relatively new video format that is meant to be completely free of patent licensing fees, and support for it seems to be improving as time goes on. It is not to be confused with the AVI container; AV1 is a video coding format (like H.264), not a container.

Getting Blender to use AV1

Fortunately for me, Blender already leverages FFmpeg for decoding and encoding video (as mentioned earlier), so all that was needed was some glue code in Blender to have FFmpeg encode to AV1 (decoding would then work automatically). The only issue was that Blender's builds might not include the dependencies required to encode AV1, but I later found out that someone in charge of Blender's dependencies would handle that after I submitted my patch. Once again, cscope in vim helped tremendously in figuring out what does what in the C/C++ code.
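
I won't reproduce Blender's actual writer code here, but the libavcodec side of that glue looks roughly like the sketch below. The function name and the specific settings (pixel format, frame rate, crf) are just illustrative placeholders, not what Blender ended up using:

    #include <libavcodec/avcodec.h>
    #include <libavutil/opt.h>

    /* Minimal sketch: look up FFmpeg's libaom AV1 encoder by name and open it.
     * Error handling is trimmed; the numbers are placeholders. */
    static AVCodecContext *open_av1_encoder(int width, int height)
    {
        const AVCodec *codec = avcodec_find_encoder_by_name("libaom-av1");
        if (!codec)
            return NULL;  /* FFmpeg was built without libaom */

        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        ctx->width     = width;
        ctx->height    = height;
        ctx->pix_fmt   = AV_PIX_FMT_YUV420P;
        ctx->time_base = (AVRational){ 1, 24 };

        /* Encoder-private options (like crf) go through the AVOption API. */
        av_opt_set(ctx->priv_data, "crf", "30", 0);

        if (avcodec_open2(ctx, codec, NULL) < 0) {
            avcodec_free_context(&ctx);
            return NULL;
        }
        return ctx;
    }

The command-line equivalent, handy for sanity checks, is something like "ffmpeg -i input.mp4 -c:v libaom-av1 -crf 30 -b:v 0 output.mkv".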

Of course, similarly to FFmpeg, Blender's developers have their own process for submitting and reviewing patches. Getting started was well documented, so reading most of their available guides was a great help in figuring out how to submit a patch in the first place, as well as the other guidelines for writing up the submission.

Things falling into place after the initial patch submission

After my first submission, other developers offered corrections, questions, and guidance for the patch. I had to set up a script that uses VMAF (a metric that computes a "video quality score" by comparing a processed video file against the original) to check that the selected codecs produced the expected quality when encoding. Long story short, libaom-av1 was the balanced/ideal codec. Also, the submission of my patch got the ball rolling on adding the libaom encoder library as a dependency of Blender. So once that was done, my patch could be merged in.

Takeaways from developing in Blender

The process for submitting and reviewing code for Blender was different from the email/mailing-list process for FFmpeg. It was nice to have a website UI for review, but I did have to install Arcanist (which, interestingly enough, is written in PHP) on my machine to submit the patch in the first place (and I also needed to register on their developer website). Once again, it is important to figure out how Blender's developers get work done before submitting anything, which is why reading the relevant documentation and submission guides matters.

If you want to see my patch, it's here.

End notes

So, contributing to open source requires a significant amount of studying a group's process before you can actually help with that group's software. It also requires some knowledge of, and experience with, working on an unfamiliar code-base. Luckily for me, the code was primarily in C, and cscope + vim made it easier to grok. Well, some people probably wouldn't call the code being in C a good thing, but I'm very familiar with C and C++, so that worked for me. The public nature of open source software, and of contributing to it, is neat. You just have to be competent enough to submit something that is of some use. Meritocracy or something like that, I suppose. And you have to follow the rules of the project you're submitting to (for a more frictionless submission). It was an interesting experience, and if I have more ideas to implement in some open source software, I may try my hand at submitting another patch. Time will tell...
