I get no unmute event for some codecs when using WebRTC streaming

I’m streaming a local USB camera with a GStreamer 1.20.3 pipeline that encodes to VP8 and sends RTP to a local udpsink (no audio).
Then I run a Janus 1.2.0 server with the streaming plugin and an RTP multistream mountpoint to deliver the video to a browser.
In the browser I use the original streamingtest page from the repo (I only changed the URLs in the settings).

When the stream track starts in the frontend, I immediately get a mute event, and a bit later I get an unmute event and the video shows. (This happens in the official Janus streaming demo, too.)

However, when I change the codec in GStreamer to H.264 or AV1, I do not get the unmute event after the initial mute event, and the video does not show. If I restart GStreamer while Janus is still running, then I do get an unmute event and the video shows in the browser.

My questions:

  1. Why is there a mute event in the first place? Is it Janus- or GStreamer-related?
  2. I assume the mute/unmute events are triggered in the backend, not in the browser. What might be causing them?
  3. Is there any way to trigger the unmute manually in the backend? Maybe as a workaround if everything else fails.

I’ve spent many hours trying to identify the issue, with no success. Does anyone have an idea what this might be?

Very likely a missing keyframe when the WebRTC PeerConnection is started, which seems to be confirmed by the fact that it works when you restart GStreamer. If you wait long enough, you’ll eventually get video and that unmute event. You’ll have to configure the pipeline to send a keyframe on a regular basis (but not too often).

That was it! Thanks Lorenzo. I added more frequent keyframes in my GStreamer H.264 encoding pipeline and everything works as expected now.
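For anyone landing here later, a minimal sketch of what such a pipeline might look like, assuming x264enc and a V4L2 camera. The device path, host, port, and payload type below are placeholders, not values from this thread; adjust them to match your Janus mountpoint configuration:

```shell
# Hypothetical example pipeline; /dev/video0, 127.0.0.1:5004 and pt=96 are placeholders.
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! videoconvert \
  ! x264enc tune=zerolatency speed-preset=ultrafast key-int-max=60 \
  ! rtph264pay config-interval=1 pt=96 \
  ! udpsink host=127.0.0.1 port=5004
```

`key-int-max=60` caps the keyframe interval at 60 frames (about 2 seconds at 30 fps), so a viewer that joins mid-stream gets a decodable frame quickly; `config-interval=1` makes rtph264pay resend the SPS/PPS parameter sets periodically, which a late-joining H.264 decoder also needs before it can start.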