Integration of OpenGL Filter into FFmpeg on Android — A Couple of HowTos

By Yevgeny Nagibin, Android Developer at Rosberry

My name is Yevgeny and I’ve been developing Android apps at Rosberry for seven years. We’ve built more than a dozen native iOS and Android photo and video processing apps with millions of installs, so working with media files has become one of our team’s favourite areas of expertise. In this article I want to share my experience of creating filters and video effects for Filmm, a popular video editing app. We published its first Android version on Google Play last year; the iOS version appeared even earlier. The target audience of the app is active social media users who need high-quality personalized video content with various filters, effects and textures applied.

At the technical level, we used OpenGL shaders to add effects in the Android app, and rendering was done with GLSurfaceView. But rendering a filter on screen was only part of the job. Ultimately, we had to turn the rendered video into a final video file so that a user could share it with friends or simply store it on their device. And here comes an important question: how can we record everything that we see on the screen into a separate video file? Let’s talk about it in more detail.

OpenGL ES

It’s no secret that Android uses OpenGL ES (OpenGL for Embedded Systems) to render 2D and 3D graphics. This is a subset of the OpenGL specification intended for embedded devices.

I will not dive into the details of how we added the various filters and video effects within our OpenGL implementation; that’s too long a conversation for one lunch. Simply put, we had to draw a video effect over the original video, which turned out to be a very difficult task. Let me just say that in the end we had a pretty large OpenGL shader that both combined the filters and effects and displayed a preview of the applied filter or effect. This huge shader worked perfectly well on different devices, but as I’ve already said above, we had one main question to answer: how could we get the filter, the effect and the video itself recorded into a video file? The main functionality of the app is not only editing a video but also producing a separate video file that can be shared.
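To give a rough idea of what such a shader does, here is a minimal sketch of an OpenGL ES 2.0 fragment shader that blends an effect texture over the original video frame, written as a C string the way it would later live inside a filter. This is not the actual Filmm shader: the uniform names (u_video, u_effect, u_intensity) are made up for illustration, and the real shader was far larger.

```c
/* Minimal sketch of a frame-blending fragment shader (OpenGL ES 2.0).
 * Uniform names are illustrative; the real shader was much bigger. */
static const char *fragment_shader_source =
    "precision mediump float;\n"
    "varying vec2 v_texCoord;\n"
    "uniform sampler2D u_video;    /* original video frame     */\n"
    "uniform sampler2D u_effect;   /* filter / effect texture  */\n"
    "uniform float u_intensity;    /* blend amount, 0.0 .. 1.0 */\n"
    "void main() {\n"
    "    vec4 video  = texture2D(u_video,  v_texCoord);\n"
    "    vec4 effect = texture2D(u_effect, v_texCoord);\n"
    "    gl_FragColor = mix(video, effect, u_intensity * effect.a);\n"
    "}\n";
```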

Rendering to a Video File

We started thinking about the options for implementing such functionality. After a bit of research on the topic, we realized that we had three options:

  1. MediaRecorder
  2. MediaCodec
  3. FFmpeg + OpenGL

We used GLSurfaceView to render OpenGL in our app because it’s one of the approaches recommended by Android. Our first idea was to simply record the current contents of the GLSurfaceView into a video file using the MediaRecorder API, but we ran into several issues here:

  1. Video encoding must be done in the background.
  2. Video encoding should be the same on all devices.

Unfortunately, we didn’t get far with implementing rendering via MediaCodec. We were put off by the large number of articles describing its quirks, as well as reports of a huge number of hidden, manufacturer-dependent bugs behind this way of recording an OpenGL filter into a separate video file. So we gave up on that option.

So the only option left was adding a custom OpenGL filter to FFmpeg. We found an example of such a filter integration on GitHub. The catch was that it used OpenGL, while Android uses OpenGL ES, so the filter would have to be ported.

Adding OpenGL to FFmpeg

So, we chose to go with a custom OpenGL filter added to FFmpeg to render the video file. The choice was made for the following reasons:

  1. The same result on all devices.
  2. The ability to use FFmpeg commands for editing together with our OpenGL filter.
  3. The wide range of source-video editing capabilities that FFmpeg provides out of the box.

But with all those advantages in place, does this solution have any disadvantages? There is actually one, and it is quite serious: FFmpeg doesn’t support hardware acceleration on Android. A good part of the video rendering is done on the CPU, which badly affects performance and the encoding speed of the video file.

So we were at a crossroads: opt for slow but stable video encoding, or choose a relatively fast option that is poorly supported across Android devices. We decided to go for the first one. On top of that, we already had a library integrated into the app for working with FFmpeg, albeit with limited capabilities.

The choice was made, and our next steps were to add an OpenGL filter to FFmpeg and then integrate all of that into the Android app, which was a long and challenging task. In general, the process looked like this:

  1. Build FFmpeg.
  2. Add OpenGL filter to FFmpeg and build FFmpeg with this filter inside.
  3. Port OpenGL Filter to Android.
  4. Build FFmpeg with a custom OpenGL filter.
  5. Integrate FFmpeg build with a custom filter into our application and test on different devices.

Building FFmpeg for Android

FFmpeg is a collection of libraries for handling media files, and it has to be built separately for each platform before it can be used in an application. Initially, we wanted to do everything by ourselves, without using additional libraries. But the task turned out to be too difficult: we tried various options and read many articles on this topic, and still weren’t able to build FFmpeg for Android.

After further research, we found a repository that helps build FFmpeg for Android. It had many advantages: detailed documentation, the latest version of FFmpeg, and an active community. After reading the documentation and spending some time on it, we managed to build FFmpeg for Android and integrate it into the app, replacing the prebuilt FFmpeg library we had used before.

Porting OpenGL Filter to Android

Now we come to the most interesting part: adding an OpenGL filter to FFmpeg. We searched different resources for similar solutions and found an interesting repository that implements a small OpenGL shader turned into a custom filter for FFmpeg.

The shader we found was written for desktop OpenGL, but Android uses OpenGL ES, a subset of OpenGL for embedded systems. So we needed to port the filter from OpenGL to OpenGL ES.
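The biggest practical difference here is context creation: a desktop OpenGL example can rely on a windowing toolkit to provide a context, while inside an FFmpeg filter on Android there is no window at all, so the usual approach is an off-screen EGL context with a pbuffer surface. Below is a rough sketch of that setup; it is a sketch only, with error handling trimmed, and the names are ours rather than taken from the original repository.

```c
#include <EGL/egl.h>

/* Rough sketch: create an off-screen OpenGL ES 2.0 context with a pbuffer
 * surface, so the filter can render without any window or GLSurfaceView. */
static int init_egl(EGLDisplay *out_dpy, EGLSurface *out_surf, EGLContext *out_ctx,
                    int width, int height)
{
    static const EGLint config_attribs[] = {
        EGL_SURFACE_TYPE,    EGL_PBUFFER_BIT,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8, EGL_ALPHA_SIZE, 8,
        EGL_NONE
    };
    static const EGLint context_attribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    const EGLint pbuffer_attribs[] = { EGL_WIDTH, width, EGL_HEIGHT, height, EGL_NONE };
    EGLConfig config;
    EGLint num_configs = 0;

    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, NULL, NULL))
        return -1;
    if (!eglChooseConfig(dpy, config_attribs, &config, 1, &num_configs) || num_configs < 1)
        return -1;

    EGLSurface surf = eglCreatePbufferSurface(dpy, config, pbuffer_attribs);
    EGLContext ctx  = eglCreateContext(dpy, config, EGL_NO_CONTEXT, context_attribs);
    if (surf == EGL_NO_SURFACE || ctx == EGL_NO_CONTEXT)
        return -1;
    if (!eglMakeCurrent(dpy, surf, surf, ctx))
        return -1;

    *out_dpy = dpy; *out_surf = surf; *out_ctx = ctx;
    return 0;
}
```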

Now I should briefly explain what a custom filter for FFmpeg means. As we already know, FFmpeg is a collection of libraries for handling media files. In fact, anyone can create their own filter to process the incoming audio or video stream, using, for example, this information, as well as studying the code of other filters. The filter then has to be added to the FFmpeg build scripts so that it can be accessed from the application.
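To make this more concrete, here is a bare-bones sketch of what such a video filter looks like in an FFmpeg 4.x-era tree. The name `myglfilter` is purely illustrative; a real filter also has to be registered in libavfilter/Makefile and libavfilter/allfilters.c, and, if it links against EGL/GLES, have its dependencies declared in configure.

```c
#include "libavutil/internal.h"
#include "avfilter.h"
#include "internal.h"

/* Bare-bones video filter skeleton (FFmpeg 4.x style). The OpenGL ES work
 * (EGL setup, shader compilation, rendering, reading pixels back into the
 * frame) would happen around this filter_frame callback. */
static int filter_frame(AVFilterLink *inlink, AVFrame *frame)
{
    AVFilterContext *ctx = inlink->dst;

    /* ... render frame->data[] through the OpenGL ES pipeline here ... */

    return ff_filter_frame(ctx->outputs[0], frame);
}

static const AVFilterPad inputs[] = {
    { .name = "default", .type = AVMEDIA_TYPE_VIDEO, .filter_frame = filter_frame },
    { NULL }
};

static const AVFilterPad outputs[] = {
    { .name = "default", .type = AVMEDIA_TYPE_VIDEO },
    { NULL }
};

AVFilter ff_vf_myglfilter = {
    .name        = "myglfilter",   /* illustrative name, not the real one */
    .description = NULL_IF_CONFIG_SMALL("Render frames through an OpenGL ES shader."),
    .inputs      = inputs,
    .outputs     = outputs,
};
```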

To make porting easier, we replaced the original shader with one that simply fills the video with a red color. It took us a while to adjust the various packages and libraries for the FFmpeg build, as well as the code of the filter itself, but in the end we managed to render a five-second video fragment completely filled with red. You can find the `transition` filter ported to OpenGL ES here.
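For reference, such a "fill everything with red" test shader is about as small as an OpenGL ES fragment shader gets; ours was something along these lines.

```c
/* Test fragment shader used to verify the pipeline: paint every pixel red. */
static const char *red_fill_shader =
    "precision mediump float;\n"
    "void main() {\n"
    "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\n"
    "}\n";
```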

Porting OpenGL Shader from Application to FFmpeg

A lot had to be moved into the custom FFmpeg filter, since our OpenGL shader, together with all of its supporting code (OpenGL initialization, texture generation, etc.), was quite large. As a result, we had a huge class that combined the OpenGL and FFmpeg logic. While implementing it, we encountered serious issues:

  1. The complexity of implementation. For the first time in our practice, we were working with OpenGL and FFmpeg together. We had never written any custom filters for FFmpeg before and, of course, had never built FFmpeg for Android, so we had to deal with rookie mistakes that popped up in the most unexpected places.
  2. Building FFmpeg. Building FFmpeg for a single CPU architecture took at least 6 minutes. That’s nothing to worry about if you build FFmpeg once or twice a week, but if you need to do it several times a day, it becomes a big, unpleasant obstacle. For example, during debugging, any change in the filter code requires rebuilding FFmpeg.
  3. Debugging. To troubleshoot filter errors during development, you need to deal with three kinds of problems:
  • Errors in the shader
  • Compile-time errors
  • OpenGL / FFmpeg implementation errors

It was easy to catch shader errors and other compile-time errors and immediately find their root cause. But OpenGL / FFmpeg processing errors left us guessing, because it was not at all clear where the error had occurred. Initially, all we got in such cases was a SIGSEGV and nothing more. In the most puzzling cases, we had to add logging to the filter’s source code for every change and then catch the crash on a particular line. There were interesting cases too, for example a crash on the tenth iteration of the filter rendering. Sometimes we just had to play a hunch.
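Shader errors were the easy ones partly because GLSL compilation failures can be read back from the driver and logged explicitly. A minimal sketch of that kind of check, surfaced through FFmpeg’s logging (the helper name and buffer size here are illustrative, not our exact code), looks like this:

```c
#include <GLES2/gl2.h>
#include "libavutil/log.h"

/* Compile a GLSL ES shader and log the driver's error message through
 * FFmpeg's logging (log_ctx is e.g. the AVFilterContext), instead of
 * failing silently and crashing somewhere later. */
static GLuint compile_shader(void *log_ctx, GLenum type, const char *source)
{
    GLuint shader = glCreateShader(type);
    GLint ok = GL_FALSE;

    glShaderSource(shader, 1, &source, NULL);
    glCompileShader(shader);
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char info_log[1024];
        glGetShaderInfoLog(shader, sizeof(info_log), NULL, info_log);
        av_log(log_ctx, AV_LOG_ERROR, "Shader compilation failed: %s\n", info_log);
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}
```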

Key Takeaways

So, after all of that, we had a stable filter that could be used in an FFmpeg command alongside other built-in filters (for example, scale) and combined with various transformations of the video, such as changing the fps or arranging keyframes. But there are still some disadvantages.
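Before getting to those, here is roughly what invoking such a filter from an FFmpeg command looks like; `myglfilter` is a placeholder name for our filter, and the actual encoder settings we used are omitted.

```
ffmpeg -i input.mp4 \
       -vf "scale=720:-2,myglfilter" \
       -r 30 -g 30 \
       -c:v libx264 output.mp4
```

Here `scale` runs before the custom filter in the same filtergraph, `-r` changes the output frame rate, and `-g` controls the keyframe interval.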

First of all, there is the speed of rendering a video file through FFmpeg. Unfortunately, FFmpeg currently uses the CPU for rendering, so the processing time is significantly longer than the length of the original video, especially on slow devices. In addition, we discovered a peculiarity related to different processors, which I also describe in this repository.
