WebRTC: Do we need a media server?
Chrome 65 introduced an upper limit on the number of peer connections a page can hold, which is used for garbage collection purposes: Chrome will not allow more than 500 concurrent peer connections to exist.
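As a rough illustration (not from the referenced post), the sketch below simply tries to open more peer connections than the cap allows. The loop count, the logging, and the assumption that the constructor fails once the limit is hit are mine; the exact behaviour can differ between Chrome versions.

// Illustrative sketch: probing the per-page peer connection cap.
// Assumption: the RTCPeerConnection constructor fails once ~500 connections
// are alive on the page; the exact error text and behaviour may vary.
const connections: RTCPeerConnection[] = [];

try {
  for (let i = 0; i < 1000; i++) {
    connections.push(new RTCPeerConnection());
  }
  console.log(`Opened ${connections.length} connections without hitting a cap`);
} catch (err) {
  console.log(`Constructor failed after ${connections.length} connections`, err);
} finally {
  // Closed connections become eligible for garbage collection and stop
  // counting against the limit.
  connections.forEach((pc) => pc.close());
}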
Assume we want to broadcast video at a low VGA resolution. We checked and decided that a bitrate of 500kbps gives good enough results for our needs.
Broadcasting that stream directly to 10 viewers requires 10 × 500kbps = 5Mbps of uplink bandwidth.
If we're on an ADSL connection, we may have only 1-3Mbps of uplink, so we won't be able to broadcast the stream to our 10 viewers.
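To make the arithmetic concrete, here is a minimal TypeScript sketch (my own, not from the referenced post) of a mesh broadcast: the broadcaster opens a separate RTCPeerConnection per viewer and sends the same track on each, so the required uplink is simply bitrate × number of viewers. The signalOffer helper is a hypothetical stand-in for whatever signaling channel you use.

// Illustrative sketch: serverless (mesh) broadcast, one peer connection per viewer.
// With ~500 kbps of video, 10 viewers already need ~5 Mbps of uplink.
const VIDEO_BITRATE_BPS = 500_000; // assumed bitrate for low VGA video

async function broadcastToViewers(viewerIds: string[]): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { width: 640, height: 480 },
    audio: false,
  });

  for (const viewerId of viewerIds) {
    const pc = new RTCPeerConnection();                          // one connection per viewer
    stream.getTracks().forEach((track) => pc.addTrack(track, stream));

    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);
    await signalOffer(viewerId, offer);                          // hypothetical signaling helper
  }

  const uplinkMbps = (VIDEO_BITRATE_BPS * viewerIds.length) / 1_000_000;
  console.log(`Required uplink: ~${uplinkMbps.toFixed(1)} Mbps for ${viewerIds.length} viewers`);
}

// Hypothetical stub so the sketch is self-contained; answer and ICE handling are omitted.
async function signalOffer(viewerId: string, offer: RTCSessionDescriptionInit): Promise<void> {
  // send the offer to this viewer over your signaling channel
}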
For the most part, we don't control where our broadcasters are going to be. Over ADSL? WiFi? A 3G network with poor connectivity? The moment we start dealing with broadcast, we have to plan for such conditions.
When we use WebRTC for a broadcast type of service, a lot of decisions end up taking place in the media server. If a viewer has a bad network, packet loss will be reported to the media server. What should the media server do in such a case?
While there’s no simple answer to this question, the alternatives here include:
Asking the broadcaster to send a new I-frame, which affects all viewers and increases bandwidth use for the near future (as a media server, you don't want to do this too often)
Asking the broadcaster to reduce bitrate and media quality to accommodate the packet losses, affecting all viewers and not only the one on the bad network
Ignoring the packet loss, sacrificing that one viewer for the “greater good” of the other viewers
Using simulcast or SVC and moving the viewer to a lower “layer” with lower media quality, without affecting other viewers (see the simulcast sketch below)
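As a rough sketch of that last option, the browser side of simulcast can be enabled with addTransceiver and sendEncodings. The rid names and per-layer bitrates below are assumptions for illustration, and the media-server side (actually switching a lossy viewer to a lower layer) is not shown.

// Illustrative sketch: enabling simulcast on the broadcaster so a media server
// can move a viewer with packet loss to a lower layer without touching others.
async function startSimulcastBroadcast(pc: RTCPeerConnection): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { width: 640, height: 480 },
  });
  const [videoTrack] = stream.getVideoTracks();

  // Three encodings of the same track; names and bitrates are illustrative.
  pc.addTransceiver(videoTrack, {
    direction: "sendonly",
    sendEncodings: [
      { rid: "q", scaleResolutionDownBy: 4.0, maxBitrate: 125_000 },
      { rid: "h", scaleResolutionDownBy: 2.0, maxBitrate: 250_000 },
      { rid: "f", maxBitrate: 500_000 },
    ],
  });

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  // The offer/answer exchange with the media server is omitted here.
}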
The alternative is to use a media server, which takes a single stream from the broadcaster and handles fanning it out to the viewers.
References
https://bloggeek.me/media-server-for-webrtc-broadcast/