Hey everyone,
I’m new to WebRTC and super grateful for the awesome content you’ve shared. Currently, I’m working on a project where I need to record a user’s webcam, let them watch it, and if they like it, store it in a database.
I’ve been using the recordplay plugin, which is great, but here’s the hitch: I want to show the recorded video in the browser’s video player as an MP4 file, and the videos must also be stored in AWS. I saw a suggestion to use janus-pp-rec, but I’m a bit lost on how to catch the “recording stopped” event and how to format the files for display on the front end.
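For the AWS part, here’s roughly the kind of upload helper I’m picturing (the bucket name and URL shape are placeholders, and it assumes boto3 with credentials already configured; the import is lazy so the rest stays usable without the AWS SDK):

```python
import os

def s3_key_for(local_path, prefix="recordings/"):
    """Derive the S3 object key from the local mp4 path (assumed flat layout)."""
    return prefix + os.path.basename(local_path)

def upload_mp4(local_path, bucket="my-recordings-bucket"):
    """Upload the converted mp4 to S3 and return a URL for the front end.

    Bucket name and URL format are my assumptions; boto3 is imported lazily
    and needs `pip install boto3` plus configured AWS credentials.
    """
    import boto3
    key = s3_key_for(local_path)
    boto3.client("s3").upload_file(local_path, bucket, key)
    return f"https://{bucket}.s3.amazonaws.com/{key}"
```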
After some poking around, I came up with three ideas:
- Python file watcher: write a simple Python script that keeps an eye on the recording folder and converts each new video to MP4 as it shows up.
- Event handling with HTTP callbacks: use Janus event handlers (e.g. HTTP callbacks) to know when a recording stops, then kick off the conversion and signaling.
- Custom plugin: write my own plugin that records the stream, saves it as an MP4 file, and returns that file’s path.
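For the first idea, here’s roughly what I have in mind (folder paths are guesses, and `janus-pp-rec` output depends on the codec; for VP8 it produces WebM, so I remux with ffmpeg afterwards):

```python
import os
import subprocess
import time

RECORD_DIR = "/opt/janus/recordings"  # assumed Janus recording folder
OUT_DIR = "/var/www/media"            # assumed web-served output folder

def output_path(mjr_path, out_dir=OUT_DIR):
    """Map a Janus .mjr recording to its target .mp4 path."""
    base = os.path.splitext(os.path.basename(mjr_path))[0]
    return os.path.join(out_dir, base + ".mp4")

def convert(mjr_path):
    """Demux the .mjr with janus-pp-rec, then convert to MP4 with ffmpeg."""
    webm = output_path(mjr_path).replace(".mp4", ".webm")
    subprocess.run(["janus-pp-rec", mjr_path, webm], check=True)
    subprocess.run(["ffmpeg", "-y", "-i", webm, output_path(mjr_path)], check=True)

def watch(poll_seconds=2):
    """Poll the recording folder and convert new .mjr files.

    NB: the .mjr file appears when recording *starts*, so a real watcher
    should wait until the file stops growing before converting it.
    """
    seen = set(os.listdir(RECORD_DIR))
    while True:
        current = set(os.listdir(RECORD_DIR))
        for name in current - seen:
            if name.endswith(".mjr"):
                convert(os.path.join(RECORD_DIR, name))
        seen = current
        time.sleep(poll_seconds)
```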
There are also a few things I don’t know how to do:
- Return the MP4 file’s URL over the same WebSocket connection.
- Handle events (I only have this article to go on).
- Write a plugin, which I guess means C, and I don’t have that kind of experience right now.
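For the event-handling part, this is the sort of receiver I imagine pointing Janus’s sample HTTP event handler at (the event type 64 for plugin events and the "recordingdone" name are my guesses; I’d have to check what the plugin actually emits):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def recording_stopped_events(payload):
    """Pick out events that look like 'recording finished' notifications.

    Janus's HTTP event handler POSTs JSON (often an array of events).
    Type 64 means a plugin event; the nested 'recordingdone' field name
    is an assumption to verify against real traffic.
    """
    events = json.loads(payload)
    if isinstance(events, dict):
        events = [events]
    return [e for e in events
            if e.get("type") == 64
            and e.get("event", {}).get("data", {}).get("event") == "recordingdone"]

class JanusEventHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        for ev in recording_stopped_events(body):
            print("recording finished:", ev)  # kick off conversion here
        self.send_response(200)
        self.end_headers()

def run(port=8085):
    # Point the event handler config's backend URL at this server.
    HTTPServer(("0.0.0.0", port), JanusEventHandler).serve_forever()
```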
I’m not sure which route to take. If you’ve got any tips or know of an easier way, I’d love to hear it.
Thanks a bunch for your help!