Connecting to stream endpoints with gstreamer

Is it possible to use GStreamer as a receiving client?
What I mean is: I need to record an ongoing stream, but do it server-side (automatically, on an API request).
I had almost given up, because I cannot find a way to process a MediaStream object in a Node environment. Instead, I thought about using GStreamer to receive that data.
Unfortunately, GStreamer cannot process a MediaStream, but can it use a mountpoint as a source in a pipeline?

GStreamer can definitely work with WebRTC, but it’s not clear what exactly you mean by

gstreamer cannot process MediaStream

I mean that GStreamer cannot act as a receiving client. Or am I wrong? Typically, receiving streams from Janus is done in JavaScript, and GStreamer is used for sending TO Janus. What I meant is that having GStreamer receive a stream from Janus is not so straightforward, if not impossible, at least for me.

It’s possible to use GStreamer with WebRTC in both directions. But I think there is a misunderstanding here:

  1. If you need to get streams from Janus, you can use RTP forwarders (on the Janus side) and something based on udpsrc on the GStreamer side, and it’s a server-side solution.
  2. If you need to implement a complete WebRTC client with GStreamer, you’ll have to use the webrtcbin element from GStreamer. But it’s a little bit tricky. I did something similar here: GitHub - RSATom/JanusVideoroomStreamer: [POC] Restream from any source supported by GStreamer to Janus Videoroom.
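As a rough sketch of option 1, a GStreamer pipeline can listen on the port you gave Janus in the `rtp_forward` request and write the incoming RTP to a file. The port number (5002), payload type (111), and output filename below are assumptions for illustration; match them to your actual forwarder configuration:

```shell
# Receive an Opus audio RTP forward from Janus and dump it to an Ogg file.
# -e makes gst-launch send EOS on Ctrl+C so the file is finalized cleanly.
gst-launch-1.0 -e udpsrc port=5002 \
    caps="application/x-rtp,media=audio,encoding-name=OPUS,payload=111,clock-rate=48000" \
  ! rtpjitterbuffer \
  ! rtpopusdepay \
  ! opusparse \
  ! oggmux \
  ! filesink location=recording.ogg
```

A video forward works the same way, with the matching caps and a depayloader such as rtpvp8depay feeding a webmmux instead.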

I know it is possible to forward streams in the VideoRoom plugin, but unfortunately I use the Streaming plugin. Is there any working solution for that?

Unfortunately, I can’t understand why it would be necessary to record something from the Streaming plugin… What is the initial source of the stream?

Besides, the Streaming plugin has built-in recording functionality, to MJR.
That said, GStreamer based WebRTC clients are most definitely doable: just check my WHIP and WHEP clients for a practical example in C (but bindings to other languages exist in GStreamer).
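For the built-in MJR route, Janus ships a post-processor, janus-pp-rec, that converts MJR recordings into playable files; the output format is inferred from the extension. The file names below are just examples:

```shell
# Convert Janus MJR recordings with the bundled post-processor.
# Audio and video are recorded as separate MJR files; paths are examples.
janus-pp-rec /path/to/recording-audio.mjr recording-audio.opus
janus-pp-rec /path/to/recording-video.mjr recording-video.webm
```

If you need a single muxed file afterwards, you can merge the two outputs with a separate tool of your choice.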