How to use Gstreamer AppSrc in Python


With the gstreamer appsrc element it is easy to push buffers (e.g. numpy arrays) into a gstreamer pipeline. A developer can benefit from a variety of already implemented gstreamer plugins: display images in a window, write frames to a video file, or send buffers over TCP/HTTP.


When to use AppSrc?

When working with videos in OpenCV, there were some problems with video display (it can't be launched in multiple threads) and with video recording (high volume of recorded video files, or a custom library build with specific options is required).

With Gstreamer it is much easier: displaying/recording can be executed in multiple threads and is flexible thanks to the variety of gstreamer plugins (tcp, http, window, video file, descriptor, ...). Of course there are some problems too: installation (sometimes it is painful to install gstreamer) and buffer duplication (when converting to Gst.Buffer).

Note: Further reading highly correlates with the previous post on how to receive buffers from a gstreamer pipeline in Python: “How to use Gstreamer Appsink in Python”.

Guide

Define gstreamer pipeline

First, let us define a simple pipeline with an appsrc element that accepts RGB buffers of shape 640×480 and displays them in a window at a constant speed of 30 frames per second.
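Such a pipeline can be sketched as a Python string in gst-launch-1.0 syntax (the element names are standard GStreamer plugins; the exact appsrc options such as emit-signals and is-live are a suggested configuration, not taken from this post):

```python
# Hypothetical pipeline string: appsrc feeds RGB 640x480 frames at 30 fps
# through a bounded queue into an on-screen sink.
command = (
    "appsrc emit-signals=True is-live=True "
    "caps=video/x-raw,format=RGB,width=640,height=480,framerate=30/1 "
    "! queue max-size-buffers=4 "
    "! videoconvert "
    "! autovideosink"
)
```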

Note: a queue is added to store buffers in exact order without dropping them in case of overload. Limit the queue size with max-size-buffers to reduce memory consumption (under overload the queue could otherwise grow huge).

Write python script

Import the classes needed to work with a Gstreamer pipeline in Python.

Initialize GstContext, which hides all the GLib.MainLoop routine that handles gstreamer events.

Initialize the Gstreamer pipeline with the command defined previously (but omit the gst-launch-1.0 keyword).

Setup AppSrc element

In order to get appsrc from the pipeline, use the following line of code.

Now, let's instruct the gstreamer appsrc element that we will be dealing with timed buffers. Check out all the options in Gst.Format.

Note: most stream muxers work with timed buffers.

Set the target buffer format for appsrc.

Note: format (colorspace), width, height and fps are required for appsrc to work properly.

Additionally, set up appsrc to block on incoming buffers when its internal queue is overloaded; otherwise a lot of buffers are going to be dropped.

Attention: all of the previous setup should be done before the pipeline is started. For this purpose the GstPipeline class has a method on_pipeline_init that is executed when Gst.Pipeline is initialized but not yet started. Let's override this method:

Now, start the pipeline.

Let's now generate random buffers (np.ndarray) and try to push them into our pipeline.

In order to convert an np.ndarray to a Gst.Buffer you can use the following approach.

Or, just use ndarray_to_gst_buffer from the gstreamer-python utils.

When we run out of buffers, simply notify appsrc with an end-of-stream event.

Stop the pipeline at the end.

Run example

Now, let's put everything together and run the examples. First, clone the gst-python-tutorials repository and install the requirements.

Simple displaying

First, launch a simple display of randomly generated numpy arrays in a window.

Display with framerate

Let's make it more difficult and check that the video is playing at the specified frame rate (30). For this purpose we'll use the fpsdisplaysink plugin, which draws the frame rate over the video or prints it to the console.

But in order to calculate the frame rate, gstreamer's fpsdisplaysink requires pts and duration to be present and valid in each Gst.Buffer. Both are unsigned 64-bit integers measured in nanoseconds (an unset value equals GLib.MAXUINT64, i.e. Gst.CLOCK_TIME_NONE). So, just extend the previous code to set them on every buffer.
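The timestamp arithmetic can be sketched without any GStreamer calls (Gst.SECOND equals 10**9 nanoseconds):

```python
FPS = 30
GST_SECOND = 10 ** 9  # nanoseconds per second, same value as Gst.SECOND

duration = GST_SECOND // FPS  # each frame lasts 1/30 s = 33333333 ns
pts = 0                       # presentation timestamp of the next frame

timestamps = []
for _ in range(3):
    # In the real loop, before appsrc.emit("push-buffer", buffer):
    #   buffer.pts = pts
    #   buffer.duration = duration
    timestamps.append(pts)
    pts += duration
```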

Perfect, now launch the pipeline with fpsdisplaysink.
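This is the display pipeline from before with fpsdisplaysink swapped in for the plain window sink (a sketch, same hedges as above):

```python
command = (
    "appsrc emit-signals=True is-live=True "
    "caps=video/x-raw,format=RGB,width=640,height=480,framerate=30/1 "
    "! queue max-size-buffers=4 "
    "! videoconvert "
    "! fpsdisplaysink"  # draws the measured frame rate over the video
)
```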

You should see the frame rate drawn over the video, and the stream's average frame rate should be almost equal to 30 frames per second.

Writing video to file

In order to write video to a file, we need to encode the buffers into an H264 video stream (using the x264enc plugin), place it into an MPEG-4 media container (mp4mux) and record it to a file with filesink.
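A sketch of the recording pipeline (x264enc, mp4mux and filesink are standard GStreamer plugins; the appsrc options are the same suggested configuration as above):

```python
command = (
    "appsrc emit-signals=True is-live=True "
    "caps=video/x-raw,format=RGB,width=640,height=480,framerate=30/1 "
    "! queue max-size-buffers=4 "
    "! videoconvert "
    "! x264enc pass=quant tune=zerolatency "  # encode to H264
    "! mp4mux "                               # MPEG-4 media container
    "! filesink location=video.mp4"           # record to file
)
```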

Note:

  • pass=quant switches x264enc to constant-quantizer encoding, so quality is controlled directly by the quantizer instead of a target bitrate.
  • tune=zerolatency reduces the encoding latency, at some cost in video quality.

A short video file named “video.mp4” should now be located in the current folder. Check its content and compare it with the video displayed previously.

Play with different gstreamer commands and check that everything works.

Conclusion

In this post we learned how to:

  • set up the gstreamer appsrc element to accept buffers from user applications
  • convert np.ndarray to Gst.Buffer
  • display randomly generated buffers in a window
  • write randomly generated buffers into a video file
  • set up and display the video's framerate

Hope everything worked as expected 🙂 In case of any troubles or suggestions, leave a comment.
