How to use Gstreamer AppSrc in Python
With the GStreamer appsrc element it is easy to push buffers (e.g. numpy arrays) into a GStreamer pipeline. A developer can then benefit from the variety of already implemented GStreamer plugins to display images in a window, write frames to a video file, or send buffers over TCP/HTTP.
- Ubuntu 18
- Python 3.6
- Gstreamer with Python Bindings. Look at “How to install Gstreamer Python Bindings on Ubuntu 18”
Learn how to:
- push Gst.Buffer into a pipeline with appsrc
- convert a numpy array to Gst.Buffer
- set up GstApp.AppSrc to push buffers
- display images using autovideosink
- record video to a file using x264enc, mp4mux and filesink
- set up a proper frame rate (fpsdisplaysink)
When to use AppSrc?
When working with videos in OpenCV, there were some problems with video display (it can't be launched from multiple threads) and video recording (high volume of recorded video files, or a custom library build with specific options is required).
With GStreamer it is much easier: displaying/recording can be executed in multiple threads, and it is flexible thanks to the variety of GStreamer plugins (tcp, http, window, video file, descriptor, ...). Of course there are some drawbacks: installation (sometimes it is painful to install GStreamer) and buffer duplication (when converting to Gst.Buffer).
Note: this post is closely related to the previous one on how to receive buffers from a GStreamer pipeline in Python: “How to use Gstreamer Appsink in Python”
Define gstreamer pipeline
First, let us define a simple pipeline with an appsrc element that accepts RGB buffers of shape 640×480 and displays them in a window at a constant rate of 30 frames per second.
gst-launch-1.0 appsrc emit-signals=True is-live=True \
    caps=video/x-raw,format=RGB,width=640,height=480,framerate=30/1 ! \
    queue max-size-buffers=4 ! videoconvert ! autovideosink
Note: queue is added to store buffers in exact order without dropping them in case of overload. Limit the queue size with max-size-buffers to reduce memory consumption (otherwise the queue could grow huge under overload).
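To see why limiting the queue matters, here is a quick back-of-the-envelope calculation in plain Python (the numbers are taken from the caps string above):

```python
# Rough memory math for the pipeline above: each raw RGB frame
# occupies width * height * 3 bytes, so an unbounded queue of raw
# frames fills memory quickly under overload.
WIDTH, HEIGHT, CHANNELS = 640, 480, 3  # from the caps string
FPS = 30

frame_bytes = WIDTH * HEIGHT * CHANNELS  # bytes in one raw RGB frame
stream_rate = frame_bytes * FPS          # bytes produced per second

print(frame_bytes)  # 921600 -> ~0.9 MB per frame
print(stream_rate)  # 27648000 -> ~26 MB per second
```

With max-size-buffers=4 the queue holds at most ~3.7 MB of raw frames instead of growing without bound.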
Write python script
Import the classes for working with a GStreamer pipeline in Python.
from gstreamer import GstContext, GstPipeline
Initialize GstContext, which hides all the GLib.MainLoop routines for handling GStreamer events.
Initialize the GStreamer pipeline with the command defined previously (but omit the gst-launch-1.0 keyword).
pipeline = GstPipeline(command)
Setup AppSrc element
In order to get appsrc from the pipeline, use the following line of code:
appsrc = pipeline.get_by_cls(GstApp.AppSrc)[0]  # get AppSrc (get_by_cls returns a list)
Now, let us instruct the appsrc element that we will be dealing with timed buffers. Check out all the options in Gst.Format.
Note: most stream muxers work with timed buffers.
Set the target buffer format for appsrc.
CAPS = "video/x-raw,format=RGB,width=640,height=480,framerate=30/1"

appsrc.set_caps(Gst.Caps.from_string(CAPS))  # set caps
Note: format (colorspace), width, height and fps are required for appsrc to work properly.
Additionally, set up appsrc to block when accepting new incoming buffers in case of internal queue overload; otherwise a lot of buffers are going to be dropped.
Attention: all of the previous setup should be done before the pipeline is started. For this purpose the class GstPipeline has a method on_pipeline_init that is executed when the Gst.Pipeline is initialized but not yet started. Let us override this method:
def on_pipeline_init(self):
    appsrc = self.get_by_cls(GstApp.AppSrc)[0]
    appsrc.set_property("format", Gst.Format.TIME)
    appsrc.set_property("block", True)
    appsrc.set_caps(Gst.Caps.from_string(CAPS))

pipeline._on_pipeline_init = on_pipeline_init.__get__(pipeline)
Now, start the pipeline.
Let us now generate random buffers (np.ndarray) and try to push them into our pipeline.
for _ in range(NUM_BUFFERS):
    array = np.random.randint(low=0, high=255,
                              size=(HEIGHT, WIDTH, CHANNELS),
                              dtype=DTYPE)
In order to convert np.ndarray to Gst.Buffer you can use the following approach:
def ndarray_to_gst_buffer(array: np.ndarray) -> Gst.Buffer:
    """Converts numpy array to Gst.Buffer"""
    return Gst.Buffer.new_wrapped(array.tobytes())
Or just use ndarray_to_gst_buffer from the gstreamer-python utils:
import gstreamer.utils as utils

appsrc.emit("push-buffer", utils.ndarray_to_gst_buffer(array))
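The conversion relies on array.tobytes(), so the resulting buffer size must match what the caps promise (width × height × channels bytes for 8-bit RGB). Here is a quick sanity check in plain numpy, no GStreamer required; the dimensions are the ones assumed throughout this post:

```python
import numpy as np

WIDTH, HEIGHT, CHANNELS = 640, 480, 3  # must match the appsrc caps
DTYPE = np.uint8                       # 8-bit RGB

array = np.random.randint(low=0, high=255,
                          size=(HEIGHT, WIDTH, CHANNELS), dtype=DTYPE)
data = array.tobytes()  # this is what Gst.Buffer.new_wrapped() receives

# Each RGB frame must be exactly width * height * channels bytes,
# otherwise videoconvert will reject or misinterpret the buffer.
assert len(data) == WIDTH * HEIGHT * CHANNELS
```

Note that the numpy shape is (height, width, channels) while the caps list width first; mixing the two up produces a buffer of the right size but with garbled image content.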
When we run out of buffers, simply notify appsrc with an end-of-stream event: appsrc.emit("end-of-stream").
Stop the pipeline at the end.
Now let us put everything together and run the examples. To begin, clone the gst-python-tutorials repository and install the requirements:
git clone https://github.com/jackersson/gst-python-tutorials.git
cd gst-python-tutorials
python3 -m venv venv
source venv/bin/activate
pip install --upgrade wheel pip setuptools
pip install --upgrade --requirement requirements.txt
First, launch a simple display of randomly generated numpy arrays in a window.
Display with framerate
Let's make it a bit more difficult and check that the video is playing at the specified frame rate (30). For this purpose we'll use the fpsdisplaysink plugin, which draws the frame rate over the video or prints it to the console.
But in order to calculate the frame rate, fpsdisplaysink requires pts and duration to be present and valid in each Gst.Buffer. Both are 64-bit unsigned integers measured in nanoseconds (an unset value equals GLib.MAXUINT64, i.e. Gst.CLOCK_TIME_NONE). So, just modify the previous code as follows:
pts = 0  # buffer presentation timestamp (in nanoseconds)
duration = 10**9 // FPS  # frame duration (in nanoseconds, must be an integer)
for _ in range(NUM_BUFFERS):
    pts += duration
    gst_buffer.pts = pts
    gst_buffer.duration = duration
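A constant integer frame duration works cleanly for integer framerates like 30. For fractional rates such as 30000/1001 (29.97 fps, common in NTSC video), accumulating a rounded integer duration drifts over time. A sketch of drift-free timestamping using Python's fractions module (the names here are illustrative, not from the tutorial code):

```python
from fractions import Fraction

FPS = Fraction(30000, 1001)             # 29.97 fps
frame_duration = Fraction(10**9) / FPS  # exact frame duration in ns

# Compute each pts from the frame index instead of accumulating
# a rounded duration, so rounding errors never add up.
def pts_for_frame(n: int) -> int:
    return int(n * frame_duration)

print(pts_for_frame(1))      # 33366666 ns (~33.37 ms)
print(pts_for_frame(30000))  # 1001000000000 ns = exactly 1001 seconds
```

With the accumulating approach, the rounding error of roughly 0.67 ns per frame would add up to about 20 µs after 30000 frames; the index-based computation stays exact.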
Perfect, now launch the pipeline with fpsdisplaysink:
python launch_pipeline/run_appsrc.py -p \
    "appsrc emit-signals=True is-live=True \
    caps=video/x-raw,format=RGB,width=640,height=480,framerate=30/1 ! \
    queue max-size-buffers=4 ! videoconvert ! fpsdisplaysink"
You should see something like this, with the stream's average frame rate almost equal to 30 frames per second.
Writing video to file
In order to write video to a file we need to encode the buffers into an H.264 video stream (using the x264enc plugin), place it into an MPEG-4 media container (mp4mux) and record it to a file with filesink:
python launch_pipeline/run_appsrc.py -p \
    "appsrc emit-signals=True is-live=True \
    caps=video/x-raw,format=RGB,width=640,height=480,framerate=30/1 ! \
    queue max-size-buffers=4 ! videoconvert ! \
    x264enc pass=quant tune=zerolatency ! mp4mux ! \
    filesink location=video.mp4"
- pass=quant selects constant-quantizer rate control, so quality is governed by the quantizer setting instead of a target bitrate.
- tune=zerolatency reduces encoding latency, at some cost in video quality.
A short video file named “video.mp4” should now be located in the current folder. Check its content and compare it with the video displayed previously.
Play with different gstreamer commands and check that everything works.
In this post we learned how to:
- set up the gstreamer appsrc element to accept buffers from user applications
- convert an np.ndarray to Gst.Buffer
- display randomly generated buffers in a window
- write randomly generated buffers into a video file
- set up and display a video's framerate
Hope everything worked as expected 🙂 In case of any troubles or suggestions, leave a comment.