Another library I would recommend people look into for live streaming over WebRTC, as an alternative to the Pion library used in this project, is the Janus WebRTC Server. I use it for ingesting RTP streams I generate from USB webcams and then playing them back with very low latency in the browser. It can even serve multiple streams simultaneously, and it has a simple HTTP API for adding, updating, and removing streams on demand.
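To give a feel for that API, here is a minimal sketch of the raw HTTP flow using curl and jq: create a session, attach a handle to the streaming plugin, ask it to list its streams, then destroy the session. The host name is a placeholder and the /janus path assumes the default HTTP transport config.
# create a Janus session and keep its id
server="https://[YOUR_JANUS_SERVER_HOST]/janus"
sid=$(curl -s -H 'Content-Type: application/json' -d '{"janus":"create","transaction":"t1"}' "$server" | jq -r .data.id)
# attach a handle to the streaming plugin
hid=$(curl -s -H 'Content-Type: application/json' -d '{"janus":"attach","plugin":"janus.plugin.streaming","transaction":"t2"}' "$server/$sid" | jq -r .data.id)
# ask the plugin for its list of streams (synchronous request, so the answer comes back directly)
curl -s -H 'Content-Type: application/json' -d '{"janus":"message","body":{"request":"list"},"transaction":"t3"}' "$server/$sid/$hid" | jq .plugindata.data.list
# clean up the session
curl -s -H 'Content-Type: application/json' -d '{"janus":"destroy","transaction":"t4"}' "$server/$sid" >/dev/null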
I went to your Discourse link. As Lorenzo was trying to say, you will want to get a better idea of what is going on with ICE candidate gathering. In Firefox there is an about:config setting that might help with this.
If you do not get any valid candidates, then it is most likely a misconfiguration of your browser, VPN, firewall, or network. Unfortunately that would not be a Janus problem, but I would need more information to know for sure.
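If you want a quick sanity check outside the browser, coturn ships a small STUN client you can point at any public STUN server to see whether a plain UDP binding request gets an answer from your network at all. A rough sketch; the server address below is a placeholder, and this only exercises STUN, not the full ICE/TURN path:
# ask a STUN server for our server-reflexive address
# if this prints a mapped address, basic outbound UDP/STUN works from this network
# [STUN_SERVER_IP_OR_HOST] is a placeholder; turnutils_stunclient is part of the coturn package
turnutils_stunclient [STUN_SERVER_IP_OR_HOST]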
I use the demo streaming.html code linked above as-is and host it statically on Nginx alongside Janus on a cloud VPS. As far as config goes, there is just the one file, /etc/janus/janus.plugin.streaming.jcfg, but you can leave it blank and just use the HTTP API if you don't want to mess with that file, since the API has an option:
"permanent":true
that automatically generates/updates the config file for you. You can substitute plain RTP for SRTP when testing, but I prefer the secure variant since the stream goes out to a public VPS and it doesn't really add any overhead for my source devices.
I wrote a little helper shell script using wget, openssl, dd, jq, and jo to make it easy to talk JSON to the API for the one-off configs I do. Here is an example of what I use, which demonstrates simulcast and SRTP ingestion for both H264 and VP8 video streams as well as Opus audio. Just fill in the [ ]'s with your specifics. I then use ffmpeg to generate all the streams and send them to the appropriate ports for each simulcast level and target. If you use GStreamer, beware that the SRTP key format is different (see the last example below).
#!/bin/sh
server="https://[YOUR_JANUS_SERVER_HOST]/janus"
token(){ dd if=/dev/urandom bs=6 count=1 status=none|openssl base64;} # random transaction id
post_jo(){ first="$1";shift;wget --quiet --output-document - --post-data "$(jo -- "$@")" "$server$first";} # POST a jo-built JSON body to an API path
tx(){ post_jo "$@" transaction="$(token)";} # same, with a fresh transaction id added
data_id(){ jq ".data.id//(null|halt_error(1))";} # extract .data.id or fail
message()( set -e
id="$(tx / janus=create|data_id)" # create janus session and store id
hn="$(tx "/$id" janus=attach plugin=janus.plugin.streaming|data_id)" # create plugin session and store id
tx "/$id/$hn" janus=message body="$(jo -- "$@")"
tx "/$id" janus=destroy >/dev/null
)
# example usage:
# list all streams
message request=list|jq .plugindata.data.list
# remove stream with id 666
message request=destroy id=666 secret=adminpwd permanent=true|jq
# create new stream
message request=create id=666 secret=adminpwd permanent=true name=[YOUR_STREAM_NAME] type=rtp media=["$(
jo type=video mid=v1 codec=h264 pt=96 simulcast=true svc=false port=5010 port2=5020 port3=5030 \
fmtp=level-asymmetry-allowed=1\;packetization-mode=1\;profile-level-id=42c01f
)","$(
jo type=video mid=v2 codec=vp8 pt=96 simulcast=true svc=false port=5012 port2=5022 port3=5032
)","$(
jo type=audio mid=a codec=opus pt=97 port=5000
)"] srtpsuite=80 srtpcrypto=zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz|jq # your srtp token needs to match your source
# show new stream info to verify configuration
message request=info id=666 secret=adminpwd|jq
# on streaming source device
ffmpeg [YOUR_INPUT_CONFIG HERE] -f rtp -srtp_out_suite SRTP_AES128_CM_HMAC_SHA1_80 -srtp_out_params zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz srtp://[YOUR_JANUS_SERVER_HOST]:[PORT]
# for gstreamer, showing alternative key format
gst-launch-1.0 [YOUR_INPUT_CONFIG HERE] ! rtpav1pay ! srtpenc key=cf3cf3cf3cf3cf3cf3cf3cf3cf3cf3cf3cf3cf3cf3cf3cf3cf3cf3cf3cf3 ! udpsink host=[YOUR_JANUS_SERVER_HOST] port=[PORT]
My issue with this (and WebRTC in general) is that most residential users (at least in the US, and probably many other countries) are behind some type of symmetric NAT or CGNAT, often without the ability to forward ports or get NAT traversal working, which makes WebRTC unusable without a TURN server, since it relies on peers communicating directly via DTLS over UDP.
The problem with needing a TURN server is that you practically have to host it yourself, because there are no fast and reliable public ones, probably due to abuse. WebTorrent has the same problem since it also uses WebRTC, and as a result, 99% of applications that use either technology simply do not work at all for me or anyone else I've asked to try these services.
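For what it's worth, self-hosting TURN is not a huge lift if you already run a VPS. Here is a rough sketch using coturn with static long-term credentials; the realm, user/password, and relay port range are placeholders, and a real deployment would want TLS and proper secret handling. The browser side then just lists it in the RTCPeerConnection iceServers config with the matching username and credential.
# minimal coturn sketch with a static long-term credential
# turn.example.com and webrtc:changeme are placeholders
turnserver \
  --listening-port 3478 \
  --fingerprint \
  --lt-cred-mech \
  --user webrtc:changeme \
  --realm turn.example.com \
  --min-port 49152 --max-port 65535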
Donut still uses WebRTC though, which, like I said, I still need TURN for, and I can prove it. I have a really hard time believing that only 20% of "users" (which users, where? everyone in the world?) need it, since, as I said, almost every residential US user needs it at a minimum.
Back when I was doing live streaming, the primary reason I wanted this was to eliminate (or reduce) the need for a CDN to deliver chunks to the end-user viewers. I fantasized about a protocol adapter like this that would act as the "host" peer, taking the video from e.g. an RTMP source and relaying it over WebRTC to all the viewer peers in a P2P fashion.
I don't know if this actually works like that, but it's fun to think about this being the missing link I wanted all those years ago.
Thanks for sharing that!
How I usually do it involves manual work: using GStreamer with a USB source and sending it out over RTP, or even sinking it to a UDP port. I will check this out!
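For reference, this is roughly the kind of manual pipeline I mean; a sketch assuming a v4l2 webcam at /dev/video0 and something listening for RTP/H264 on UDP port 5000 (device, host, and port are placeholders):
# sketch: USB webcam -> H264 -> RTP -> UDP
# /dev/video0, 127.0.0.1 and 5000 are placeholders
gst-launch-1.0 -v v4l2src device=/dev/video0 ! videoconvert ! \
  x264enc tune=zerolatency bitrate=2000 speed-preset=ultrafast ! \
  rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=5000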
> Capture camera stream, encode it in H264/VP8/VP9, and send it to a RTP server
It says here a few encodings, but below there's only an h264 package; for the others I would have to write my own package, I assume? Also, any AV1 support if the hardware allows it?
Here is the library's streaming demo:
https://janus.conf.meetecho.com/demos/streaming.html