Nvcompositor documentation or alternative

In the context of a video pipeline, before the video sink nv3dsink, I'm inserting an nvcompositor to mix a GUI surface and the live video.

What are the requirements for the nvcompositor sink_X pads?

  • Do the inputs need to have the same resolution?
  • Do the inputs need to have the same format?
  • Can we modify a sink_X resolution on the fly (dynamically)?

Or point me to the relevant documentation.

In the example pipeline below:

  • the videotestsrc is 640x360, as is sink_1
  • the live video is 1280x720, on sink_0

but sink_1 is upscaled to 1280x720.

Thanks.

Pipeline:

v4l2src device=/dev/video0 io-mode=4 ! image/jpeg,width=1280,height=720,framerate=30/1 ! jpegdec ! nvvidconv ! \
  video/x-raw(memory:NVMM),format=RGBA,width=1280,height=720 ! tee name=t \
  t. ! queue ! appsink name=sink \
  t. ! queue name=mux_queue ! identity name=delay_element sleep-time=1000 ! stream-muxer.sink_0 \
  nvstreammux name=stream-muxer width=1280 height=720 batch-size=1 live-source=TRUE ! \
  nvtracker name=tracker ! nvdsosd name=osd display-bbox=true display-clock=false process-mode=1 ! \
  nvcompositor name=comp \
    sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1280 sink_0::height=720 sink_0::zorder=0 \
    sink_1::xpos=0 sink_1::ypos=0 sink_1::width=1280 sink_1::height=720 sink_1::zorder=1 ! \
  nv3dsink sync=false \
  videotestsrc is-live=true ! video/x-raw,format=RGBA,width=640,height=360,framerate=30/1 ! queue ! nvvidconv ! \
  video/x-raw(memory:NVMM),width=640,height=360,format=RGBA ! comp.sink_1

Jetson Orin Nano/NX
DeepStream 7.1

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (For bugs: include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (For new requirements: include the module name, i.e. for which plugin or sample application, and the function description.)

The nvcompositor plugin is not compatible with DeepStream SDK plugins. We suggest using the DeepStream plugin nvmultistreamtiler, which composites a 2D tile from batched buffers. Here is a sample of how to use nvmultistreamtiler:

gst-launch-1.0 filesrc location=/opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264 ! h264parse ! nvv4l2decoder ! mux.sink_0 \
    filesrc location=/opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264 ! h264parse ! nvv4l2decoder ! mux.sink_1 \
    nvstreammux name=mux batch-size=2 width=1920 height=1080 nvbuf-memory-type=3 ! nvmultistreamtiler ! nveglglessink

Can you explain why it is not compatible with DeepStream? Is it a GStreamer plug-in?


I tried to run the sample, but it failed.

gst-launch-1.0 filesrc  location=/opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264 ! h264parse ! nvv4l2decoder ! mux.sink_0  \
           filesrc  location=/opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264 ! h264parse ! nvv4l2decoder ! mux.sink_1 \
   nvstreammux name=mux batch-size=2 width=1920 height=1080 nvbuf-memory-type=3 ! nvmultistreamtiler !  nveglglessink
Setting pipeline to PAUSED ...

Using winsys: x11 
Opening in BLOCKING MODE 
Opening in BLOCKING MODE 
Pipeline is PREROLLING ...
Got context from element 'eglglessink0': gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
NvMMLiteOpen : Block : BlockType = 261 
NvMMLiteOpen : Block : BlockType = 261 
NvMMLiteBlockCreate : Block : BlockType = 261 
NvMMLiteBlockCreate : Block : BlockType = 261 
/dvs/git/dirty/git-master_linux/nvutils/nvbufsurftransform/nvbufsurftransform.cpp:4550: => Surface type not supported for transformation NVBUF_MEM_CUDA_UNIFIED

/dvs/git/dirty/git-master_linux/nvutils/nvbufsurftransform/nvbufsurftransform.cpp:4550: => Surface type not supported for transformation NVBUF_MEM_CUDA_UNIFIED

^Chandling interrupt.
Interrupt: Stopping pipeline ...
Setting pipeline to NULL ...
Freeing pipeline ...

Looking at the documentation, how do you expect me to do the same thing?

Who decided it is no longer compatible?

GStreamer version 1.0 includes the following proprietary NVIDIA plugins:

+---------------------------+-------------------------------------------+
| NVIDIA proprietary plugin | Description                               |
+===========================+===========================================+
| nvarguscamerasrc          | Camera plugin for ARGUS API               |
+---------------------------+-------------------------------------------+
| nvv4l2camerasrc           | Camera plugin for V4L2 API                |
+---------------------------+-------------------------------------------+
| nvvidconv                 | Video format conversion and scaling       |
+---------------------------+-------------------------------------------+
| nvcompositor              | Video compositor                          |
+---------------------------+-------------------------------------------+
| nveglstreamsrc            | Acts as GStreamer source component;       |
|                           | accepts EGLStream from EGLStream producer |
+---------------------------+-------------------------------------------+
| nvvideosink               | Video sink component; accepts YUV-I420    |
|                           | format and produces EGLStream (RGBA)      |
+---------------------------+-------------------------------------------+
| nvegltransform            | Video transform element for NVMM to       |
|                           | EGLImage (supported with nveglglessink    |
|                           | only)                                     |
+---------------------------+-------------------------------------------+

  1. Here is the corrected sample command-line for Jetson:

gst-launch-1.0 filesrc location=/opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264 ! h264parse ! nvv4l2decoder ! mux.sink_0 \
    filesrc location=/opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264 ! h264parse ! nvv4l2decoder ! mux.sink_1 \
    nvstreammux name=mux batch-size=2 width=1920 height=1080 ! nvmultistreamtiler ! fakesink
  2. Since nvcompositor is not a DeepStream plugin, it may not work with some DeepStream plugins; please refer to this topic. Hence we suggest using the DeepStream plugin nvmultistreamtiler.

I want to mix the video flow with a GUI (Dear ImGui). What would you recommend for mixing DeepStream and a GUI?

Thanks.

You might use this pipeline: "... ! nvmultistreamtiler ! nvvideoconvert ! appsink". nvmultistreamtiler is responsible for mixing the videos; nvvideoconvert converts the video format; then you can call the third-party display library from the appsink for rendering, as in the sketch below.

Hi Fanzh, please refer to my pipeline. Please take the time to read and understand the questions.

From your pipeline, the command-line mixes two sources: one from v4l2src and the other from videotestsrc. Modifying my last command-line can meet your requirement. To be specific, you can replace the filesrc elements with your new source plugins; nvstreammux then forms a batch of frames from the multiple input sources; nvinfer, nvtracker, and other plugins may be inserted after nvstreammux; nvmultistreamtiler composites a 2D tile from the batched buffers; and finally you may use nv3dsink or appsink to display the buffers. A sketch of this substitution follows below.

I do NOT want a side-by-side composite, but to overlay the streams on top of each other with alpha blending, with priority according to the zorder parameter.
Thanks.

Thanks for sharing! nvmultistreamtiler currently only supports side-by-side compositing. nvcompositor supports overlay compositing, but without alpha blending.

I know; I was quoting you.

So what I want is documentation for using nvcompositor. Why are you talking about nvmultistreamtiler? Are you an AI?

thanks

Hi,
The nvcompositor plugin is not supported in the DeepStream SDK. If you have DeepStream plugins in your GStreamer pipeline, please use the nvmultistreamtiler plugin.

This is like apples and oranges; you cannot compare them, and I'm using it as a GStreamer element.

To illustrate, and to make sure this is not a communication barrier: here is the result of mixing graphics and processed video. How do I do the same with the tiler that you recommended?

Thanks,

Michel C.

Hi,
The nvmultistreamtiler plugin doesn’t support this use-case. The supported use-case is to tile the sources vertically or horizontally.

As an application engineer, what would you recommend? nvcompositor does not work as expected and is not supported; you propose a solution based on the tiler element, which is in fact not a solution; and you link to another issue:

So you are telling me that with a Jetson-type device we cannot overlay graphics and live video.

A spec that NVIDIA contributed to, and it looks like it has decided not to support:
https://registry.khronos.org/OpenGL/extensions/OES/OES_EGL_image_external.txt

Hi,
As suggested, you can map an NVMM buffer to NvBufSurface and then map it to an EglImage. This is supported. If you create the EglImage first and would like to map it to an NVMM buffer, that is not supported.

In your use-case, does it work to map the NVMM buffer to NvBufSurface, map it to an EglImage, and use that EglImage in the GUI app? If you must create the EglImage in the GUI app, a memory copy is required. A sketch of the supported direction is below.

Let’s continue in
Zero-copy EGLImage texture upload? - #23 by DaneLLL
