How to use Gstreamer AppSrc in Python

With the gstreamer appsrc element it is easy to push buffers (e.g. numpy arrays) into a gstreamer pipeline. The developer can then benefit from the variety of already implemented gstreamer plugins: display the image in a window, write frames to a video file, or send buffers over TCP/HTTP.

When to use AppSrc?

When working with video in OpenCV there are some problems: video display (it can't be launched from multiple threads) and video recording (recorded files are large, or a custom library build with specific options is required).

With Gstreamer it is much easier: displaying/recording can be executed in multiple threads, and it is flexible thanks to the variety of gstreamer plugins (tcp, http, window, video file, descriptor, ...). Of course there are some downsides: installation (sometimes it is painful to install gstreamer) and buffer duplication (when converting to Gst.Buffer).

Note: what follows closely mirrors the previous post on how to receive buffers from a gstreamer pipeline in Python: “How to use Gstreamer Appsink in Python”.

Guide

Define a gstreamer pipeline

First, let us define a simple pipeline with an appsrc element that accepts RGB buffers of shape 640×480 and displays them in a window at a constant 30 frames per second.

gst-launch-1.0 appsrc emit-signals=True is-live=True \
 caps=video/x-raw,format=RGB,width=640,height=480,framerate=30/1 ! \
 queue max-size-buffers=4 ! videoconvert ! autovideosink

Note: queue is added to store buffers in exact order without dropping them in case of overload. Limit the queue size with max-size-buffers to reduce memory consumption (otherwise the queue can grow very large under load).

Write a Python script

Import the classes that let us work with a Gstreamer pipeline in Python.

from gstreamer import GstContext, GstPipeline
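
The snippets below also use Gst, GstApp, numpy and the package utils, so a fuller import preamble looks roughly like this (a sketch; the gstreamer-python package may also re-export Gst and GstApp directly, check the package source):

import numpy as np

import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstApp", "1.0")
from gi.repository import Gst, GstApp  # standard PyGObject bindings

from gstreamer import GstContext, GstPipeline
import gstreamer.utils as utils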

Initialize GstContext, which hides all the GLib.MainLoop routine that handles gstreamer events.

with GstContext():

Initialize the Gstreamer pipeline with the command defined previously (but omit the gst-launch-1.0 keyword):

pipeline = GstPipeline(command)
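
For reference, command here is just the pipeline string from above without the gst-launch-1.0 prefix:

command = ("appsrc emit-signals=True is-live=True "
           "caps=video/x-raw,format=RGB,width=640,height=480,framerate=30/1 ! "
           "queue max-size-buffers=4 ! videoconvert ! autovideosink")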

Setup AppSrc element

In order to get appsrc from the pipeline, use the following line of code:

appsrc = pipeline.get_by_cls(GstApp.AppSrc)[0]  # get AppSrc

Now, let us instruct the gstreamer appsrc element that we will be dealing with timed buffers. Check out all the available options in Gst.Format.

appsrc.set_property("format", Gst.Format.TIME)        

Note: most stream muxers work with timed buffers.

Set the target buffer format (caps) for appsrc.

CAPS = "video/x-raw,format=RGB,width=640,height=480,framerate=30/1"
appsrc.set_caps(Gst.Caps.from_string(CAPS))  # set caps

Note: format (colorspace), width, height and framerate are required for appsrc to work properly.
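
These caps also pin down the exact byte size of every buffer we push. As a quick sanity check (an illustrative sketch; the constants are this tutorial's own and are reused in the snippets below):

import numpy as np

WIDTH, HEIGHT, CHANNELS = 640, 480, 3  # must match the caps string
DTYPE = np.uint8                       # RGB uses one byte per channel

frame = np.zeros((HEIGHT, WIDTH, CHANNELS), dtype=DTYPE)
assert frame.nbytes == WIDTH * HEIGHT * CHANNELS  # 921600 bytes per frame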

Additionally, set up appsrc to block the acceptance of newly incoming buffers when its internal queue is overloaded, otherwise a lot of buffers are going to be dropped:

appsrc.set_property("block", True)

Attention: all of the previous setup should be done before the pipeline is started. For this purpose the class GstPipeline has a method on_pipeline_init that is executed when the Gst.Pipeline is initialized but not yet started. Let's override this method (the __get__ call below binds our plain function as a method of this particular pipeline instance):

def on_pipeline_init(self):
    appsrc = self.get_by_cls(GstApp.AppSrc)[0]       
    appsrc.set_property("format", Gst.Format.TIME)      
    appsrc.set_property("block", True)       
    appsrc.set_caps(Gst.Caps.from_string(CAPS))

pipeline._on_pipeline_init = on_pipeline_init.__get__(pipeline)

Now, start the pipeline:

pipeline.startup()

Let's now generate random buffers (np.ndarray), reusing the constants defined above, and try to push them into our pipeline.

NUM_BUFFERS = 100  # example value: push 100 random frames

for _ in range(NUM_BUFFERS):
    array = np.random.randint(low=0, high=255,
                              size=(HEIGHT, WIDTH, CHANNELS),
                              dtype=DTYPE)

In order to convert np.ndarray to Gst.Buffer you can use the following approach (note that array.tobytes() copies the data, which is the buffer duplication mentioned earlier):

def ndarray_to_gst_buffer(array: np.ndarray) -> Gst.Buffer:
    """Converts numpy array to Gst.Buffer"""
    return Gst.Buffer.new_wrapped(array.tobytes())

Or just use ndarray_to_gst_buffer from the gstreamer-python utils:

import gstreamer.utils as utils
appsrc.emit("push-buffer", utils.ndarray_to_gst_buffer(array))

When we are out of buffers, simply notify appsrc with an end-of-stream event:

appsrc.emit("end-of-stream")

Stop the pipeline at the end:

pipeline.shutdown()
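
Putting the pieces together, the overall control flow looks roughly like this. This is a condensed sketch under the assumptions above (command, on_pipeline_init and the constants from earlier snippets); see run_appsrc.py in the repository for the complete version, which waits for the pipeline to finish properly instead of sleeping:

import time

with GstContext():  # runs the GLib.MainLoop that dispatches gstreamer events
    pipeline = GstPipeline(command)
    pipeline._on_pipeline_init = on_pipeline_init.__get__(pipeline)
    pipeline.startup()

    appsrc = pipeline.get_by_cls(GstApp.AppSrc)[0]
    for _ in range(NUM_BUFFERS):
        array = np.random.randint(low=0, high=255,
                                  size=(HEIGHT, WIDTH, CHANNELS), dtype=DTYPE)
        appsrc.emit("push-buffer", utils.ndarray_to_gst_buffer(array))
    appsrc.emit("end-of-stream")

    time.sleep(1)  # crude simplification: give the sink a moment to drain
    pipeline.shutdown()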

Run example

Now, let's put everything together and run the examples. First, clone the gst-python-tutorials repository and install the requirements.

git clone https://github.com/jackersson/gst-python-tutorials.git
cd gst-python-tutorials

python3 -m venv venv
source venv/bin/activate

pip install --upgrade wheel pip setuptools
pip install --upgrade --requirement requirements.txt

Simple display

First, launch a simple display of randomly generated numpy arrays in a window.

python launch_pipeline/run_appsrc.py 

Display with framerate

Let's make it more difficult and check that the video is playing at the specified frame rate (30). For this purpose we'll use the fpsdisplaysink plugin, which draws the frame rate over the video or prints it to the console.

But in order to calculate the frame rate, gstreamer's fpsdisplaysink requires pts (presentation timestamp) and duration to be present and valid on each Gst.Buffer. Both are unsigned 64-bit integers measured in nanoseconds (the value GLib.MAXUINT64 means "not set"). So, modify the previous code along these lines:

FPS = 30                 # must match the framerate in the caps
pts = 0                  # buffer presentation timestamp, in nanoseconds
duration = 10**9 // FPS  # frame duration in nanoseconds (keep it an integer)

for _ in range(NUM_BUFFERS):
    gst_buffer = utils.ndarray_to_gst_buffer(array)
    gst_buffer.pts = pts
    gst_buffer.duration = duration
    pts += duration
    appsrc.emit("push-buffer", gst_buffer)

Perfect, now launch the pipeline with fpsdisplaysink:

python launch_pipeline/run_appsrc.py -p \
"appsrc emit-signals=True is-live=True \
caps=video/x-raw,format=RGB,width=640,height=480,framerate=30/1 ! \
 queue max-size-buffers=4 ! videoconvert ! fpsdisplaysink"

You should see the frame rate drawn over the video, and on average the stream's frame rate should be almost exactly 30 frames per second.

Writing video to file

In order to write video to a file we need to encode the buffers into an H264 video stream (using the x264enc plugin), place it into an MPEG-4 media container (mp4mux) and record it to a file with filesink:

python launch_pipeline/run_appsrc.py -p \
"appsrc emit-signals=True is-live=True \
caps=video/x-raw,format=RGB,width=640,height=480,framerate=30/1 ! \
 queue max-size-buffers=4 ! videoconvert ! \
x264enc pass=quant tune=zerolatency ! mp4mux ! \
 filesink location=video.mp4"

Note:

  • pass=quant selects constant-quantizer encoding: quality is controlled by the quantizer value rather than by a target bitrate.
  • tune=zerolatency reduces the encoding latency, but at some cost in video quality.
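
If you want to trade file size for quality, x264enc also has a quantizer property that works together with pass=quant (lower values mean higher quality and larger files), for example:

... ! videoconvert ! x264enc pass=quant quantizer=25 tune=zerolatency ! mp4mux ! ...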

A short video file named “video.mp4” should now be located in the current folder. Check its content and compare it with the video displayed previously.

Play with different gstreamer commands and check that everything works.

Conclusion

In this post we learned how to:

  • set up the gstreamer appsrc element to accept buffers from user applications
  • convert np.ndarray to Gst.Buffer
  • display randomly generated buffers in a window
  • write randomly generated buffers into a video file
  • set up and display the video's framerate

Hope everything worked as expected 🙂 In case of any troubles or suggestions, leave a comment.
