Video appears after a random delay in Streaming Plugin

Hey there! What an amazingly flexible project you've crafted! Great job, and thanks for it!

However, I'm facing a very frustrating issue that I haven't been able to resolve for the past few days. I have an Android app that captures the screen and sends it to my Janus server via a Streaming plugin handle (for that purpose, I'm using an open-source implementation called apuppet-android: apuppet-android/app/src/main/java/com/hmdm/control/janus/JanusStreamingPlugin.java at 88406391ce5f8a5e730bc82b8c098d3fd22fcf4a · MrYoda/apuppet-android · GitHub), which effectively creates the Streaming plugin mountpoint with:

{
   message: {
      request: "create",
      type: "rtp",
      permanent: false,
      is_private: false,
      video: true,
      audio: false,
      id: "randomlyGenerated8Letters",
      pin: "randomlyGenerated4Letters",
      videoport: 0,
      videopt: 100,
      videortpmap: "H264/90000",
      videobufferkf: true
   }
}

So far so good. Then I implemented a web client (using janus.js) that joins the session, sends a simple "watch" request, and then sends a "start" request as soon as I receive the JSEP. The problem is that the client initially receives the video MediaStreamTrack as muted, and it takes a random delay (it varies a lot, from 10 seconds to 10 minutes) before the track unmutes:

Remote track muted: Event {isTrusted: true, type: 'mute', target: MediaStreamTrack, currentTarget: MediaStreamTrack, eventPhase: 2, …}
Remote track flowing again: Event {isTrusted: true, type: 'unmute', target: MediaStreamTrack, currentTarget: MediaStreamTrack, eventPhase: 2, …}
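
For reference, the relevant part of my web client looks roughly like this (a simplified sketch; mountpointId and mountpointPin are placeholders for the values used at creation, and the exact callback names may differ slightly depending on your janus.js version):

// Simplified sketch of the watch/start flow; "janus" is the already-connected
// Janus session, mountpointId/mountpointPin match the "create" request above
var streaming = null;
janus.attach({
  plugin: "janus.plugin.streaming",
  success: function(pluginHandle) {
    streaming = pluginHandle;
    // Ask to watch the mountpoint
    streaming.send({ message: { request: "watch", id: mountpointId, pin: mountpointPin } });
  },
  onmessage: function(msg, jsep) {
    if(jsep) {
      // Got the offer from the plugin: answer it and send "start" right away
      streaming.createAnswer({
        jsep: jsep,
        success: function(answerJsep) {
          streaming.send({ message: { request: "start" }, jsep: answerJsep });
        },
        error: function(error) {
          console.error("createAnswer failed", error);
        }
      });
    }
  },
  onremotetrack: function(track, mid, added) {
    // This is where the (initially muted) video track arrives and gets
    // attached to a <video> element
  }
});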

When I debug using chrome://webrtc-internals, I can see that keyframes are actually being sent very frequently, so the delay can't be explained by keyframe frequency. At this point I'm quite lost and don't know what to check next. Could you please guide me?

Thanks a lot in advance :slightly_smiling_face:

When creating H.264 mountpoints you should specify the fmtp as well (I think the sample mountpoint we provide for H.264 gives an example). Janus doesn't touch the media, so if the SDP is not the problem, any issue you have with video is most of the time a problem at the source. Make sure RTP packets are not too large (WebRTC adds overhead for SRTP and extensions, which may cause the size to exceed the MTU). Also make sure keyframes are not too frequent: WebRTC is not HLS, so you don't need a keyframe every 1-2 seconds.
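
For instance, something along these lines in the "create" request (this fmtp string is the one I recall from the sample config; adjust the profile-level-id to whatever your encoder actually produces):

{
   message: {
      request: "create",
      type: "rtp",
      // ...same properties as in your request above...
      videopt: 100,
      videortpmap: "H264/90000",
      // fmtp as in the sample H.264 mountpoint; profile-level-id and
      // packetization-mode should match what the encoder actually sends
      videofmtp: "profile-level-id=42e01f;packetization-mode=1"
   }
}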

Try recording the mountpoint to an .mjr file, and converting the .mjr to an .mp4 with janus-pp-rec. That will tell you whether media players, at least, think the video you're sending is OK. If not, that might explain why browsers don't like it either.
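
If recording isn't enabled in the mountpoint configuration, you can start one dynamically with a "recording" request on the plugin handle, e.g. something like:

{
   message: {
      request: "recording",
      action: "start",
      id: "randomlyGenerated8Letters",
      // example path: where Janus should write the video recording
      video: "/tmp/mp-video.mjr"
   }
}

and then something like janus-pp-rec /tmp/mp-video.mjr /tmp/mp-video.mp4 to get a file you can try in a player.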


Thanks a lot for the reply :slight_smile: I'll try to see if the streamed media has some flags or configuration missing, as it definitely works as expected when I stream using the ffmpeg CLI.

I'm facing a similar issue here; did you manage to sort it out? I'm also using apuppet-android.