Real-World End-to-End Latency Benchmarks by Protocol


Prospective customers often ask us:

  • What end-to-end latency is actually achievable using the Ceeblue Media Fabric?
  • How will choosing one player or protocol over another affect latency?
  • Is there a hidden asterisk with “actual results may vary” somewhere in the fine print?

The purpose of this article is to provide answers to these questions with as few asterisks as possible.

What end-to-end latency is actually achievable using the Ceeblue Media Fabric?

The Ceeblue Media Fabric can deliver end-to-end latencies below 200 milliseconds. Of course, the latency of real-time workflows depends on successfully optimizing many different factors at different stages. Each of our customers’ integrations is unique, with its own benefits and challenges, and none will obtain the exact results shown here. We are providing these anecdotal results so that our customers can get a sense of what to expect when using the most bleeding-edge real-time solution available today.

How will choosing one protocol over another affect latency?

In January 2024, Ceeblue performed a series of real-world tests using different combinations of players, ingest protocols and delivery protocols.

This testing is not exhaustive, and there are protocols and combinations of protocols that we happily support but which were excluded from this process so that we could focus on our most-requested formats.

These realistic benchmarks will not only provide invaluable guidance for our customers, based on real-world results, but will also contribute to a comprehensive report being drafted by the CDN Alliance’s Low Latency Workgroup, which Ceeblue co-chairs, regarding the state of low-latency solutions.

The Testing Environment: True End-to-End Latency

  • Browser: Windows 10, Chrome 120.0.6099.224
  • Broadcaster Location: Tulle, France
  • Ceeblue Node: Fokkerweg, Netherlands
  • Viewer Location: Tulle, France
  • Source: GStreamer 1.20.4 with the epochtime plugin
  • Stream configuration: 1 video track, H.264, 720p, fps=30, segmentation=2s

GStreamer was used to push the stream from the residence of one of our engineers, using a residential internet connection, with a standard, ISP-provided WiFi router.

The engineer then opened the webpage that loads our demo environment. This environment embeds epoch timestamps on the canvas, recording the exact moment the video is streamed from the source.

When playback occurs in the viewer’s browser, this timestamp is compared to the current time, and the difference between the two is displayed as the end-to-end latency of the stream.
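The measurement described above can be sketched in a few lines of JavaScript (illustrative only; the function name is ours, not part of the Ceeblue demo environment):

```javascript
// Sketch of the end-to-end latency measurement described above.
// The broadcaster burns the current epoch time (in ms) into each frame;
// the viewer reads it back (passed in directly here for illustration)
// and subtracts it from the local clock. Assumes synchronized clocks.
function endToEndLatencyMs(embeddedEpochMs, nowMs = Date.now()) {
  return nowMs - embeddedEpochMs;
}

// A frame stamped 250 ms ago measures roughly 250 ms of latency:
console.log(endToEndLatencyMs(Date.now() - 250));
```

In practice the timestamp must survive encoding and decoding, which is why it is burned into the video canvas rather than carried in metadata.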

Ceeblue provides incredibly fast start times. In the first second or two, a stream will show slightly higher latency (around 1 second for WebRTC, for example), but it quickly drops to a much lower, stable latency. Our engineer duly recorded that stable latency for each and every combination of ingest protocol, output protocol, and player in the table below.

| Output Protocol | Player | Passthrough / Transcode | RTMP Ingest | SRT Ingest | WebRTC Ingest¹ |
| --- | --- | --- | --- | --- | --- |
| WebRTC | Ceeblue | Passthrough | 250 | 400 | 240 |
| WebRTC | Ceeblue | With Ceeblue transcode | 300 | 500 | 300 |
| HESP | THEOplayer | Passthrough | 700 | 960 | 700 |
| HESP | THEOplayer | With Ceeblue transcode | 870 | 1,130 | 760 |
| LL-HLS² (CMAF) | hls.js | Passthrough | 3,220 | 2,800 | 2,900 |
| LL-HLS² (CMAF) | hls.js | With Ceeblue transcode | 3,330 | 3,100 | 3,160 |
| DASH (CMAF) | dash.js | Passthrough | 4,900 | 4,920 | 4,900 |
| DASH (CMAF) | dash.js | With Ceeblue transcode | 5,500 | 5,500 | 5,360 |
| HLS (MPEG-TS) | hls.js | Passthrough | 10,000 | 10,100 | 9,800 |
| HLS (MPEG-TS) | hls.js | With Ceeblue transcode | 10,800 | 11,200 | 10,800 |

All results in milliseconds.
These measurements reflect the full end-to-end (round-trip) path from Tulle, France, to Oude Meer, Netherlands, and back to Tulle, France, an approximate distance of 1,628 km / 1,012 mi “as the crow flies.”
  1. Using https://github.com/meetecho/simple-whip-client
  2. HLS and LL-HLS tests were done with the hls.js default configuration, not optimized for low latency and without moving in the timeline.
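Since footnote 2 notes that the hls.js defaults were used, it is worth showing what low-latency tuning looks like. The option names below come from hls.js’s public configuration surface, but the values are our own assumptions for illustration, not the configuration used in these tests:

```javascript
// Illustrative hls.js low-latency tuning (NOT the defaults used in the
// tests above; the values here are assumptions, not tested settings).
const lowLatencyConfig = {
  lowLatencyMode: true,         // fetch partial segments at the live edge
  liveSyncDuration: 1.5,        // target distance from the live edge, in seconds
  maxLiveSyncPlaybackRate: 1.1, // allow slight speed-up to chase the live edge
  backBufferLength: 30,         // keep the back buffer short
};

// In the browser, with hls.js loaded (hypothetical URL):
// const hls = new Hls(lowLatencyConfig);
// hls.loadSource('https://example.invalid/stream/index.m3u8');
// hls.attachMedia(videoElement);
```

Tuning like this, combined with staying at the live edge of the timeline, could narrow the LL-HLS gap shown in the table.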

Breaking Down the Results of the Protocol / Player Benchmarking

Right off the bat, these observations stand out:

Input Protocol

  • WebRTC with WHIP ingest was the lowest-latency option (or was tied for lowest) in 8 out of 10 configurations, and in the two configurations it did not win, it was within 100 ms of the fastest option.
  • RTMP ingest was a very close second. 
  • SRT, due to its inherent buffering, places third among Input Protocols.
  • The “is-live=true” option on GStreamer’s videotestsrc/audiotestsrc elements is important for getting the best latency.

Output Protocol

  • The fragmented protocols result in higher latencies, in part because of additional buffering (which may also make these measurements less precise).
  • WebRTC and HESP both deliver dramatically lower latency than HLS, DASH, and even LL-HLS.
  • The Ceeblue transcoder adds as little as 50 milliseconds of latency.

These Are Actual Results, But Yours Still May Vary

After reviewing these results, another of our engineers replicated the WebRTC testing from another country, and he obtained higher latencies for SRT and lower latencies for both RTMP and WebRTC ingest:

| Output Protocol | Player | Passthrough / Transcode | RTMP Ingest | SRT Ingest | WebRTC Ingest |
| --- | --- | --- | --- | --- | --- |
| WebRTC | Ceeblue | Passthrough | 208 | 431 | 178 |
| WebRTC | Ceeblue | With Ceeblue transcode | 281 | 528 | 284 |

All results in milliseconds.

Based on these real-world results, as well as our experience with dozens of customers, we can generalize and provide the following ballpark guidance:

| Protocol | Expected Latency |
| --- | --- |
| WebRTC | 300 ms |
| HESP | 800 ms |
| LL-HLS | 3,000 ms |
| DASH | 5,000 ms |
| HLS | 10,000 ms |

As we mentioned above, each workflow has different components and requirements, and these will ultimately determine the tools used, the protocols selected, and, as a result, the achievable latency. That said, the table above represents reasonable ballpark latencies that can be expected.
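As a rough illustration of how this ballpark guidance can inform protocol selection, here is a small helper (our own invention, not a Ceeblue API) that filters the table above by a latency budget:

```javascript
// Ballpark end-to-end latencies from the guidance table above (ms).
const BALLPARK_LATENCY_MS = {
  WebRTC: 300,
  HESP: 800,
  'LL-HLS': 3000,
  DASH: 5000,
  HLS: 10000,
};

// Illustrative helper: return the protocols whose ballpark latency
// fits within a given end-to-end latency budget, fastest first.
function protocolsWithinBudget(budgetMs) {
  return Object.entries(BALLPARK_LATENCY_MS)
    .filter(([, latencyMs]) => latencyMs <= budgetMs)
    .map(([protocol]) => protocol);
}

console.log(protocolsWithinBudget(1000)); // ["WebRTC", "HESP"]
```

For a sub-second requirement, only WebRTC and HESP fit the budget, which matches the observations in the benchmarks above.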

HESP Optimization Since Testing

These tests were undertaken four months ago, and since then one of our engineering teams has been hard at work optimizing our HESP implementation in anticipation of some new services we will be launching and announcing at NAB Las Vegas.

We are incredibly excited to announce that with our current implementation of HESP we are achieving latencies previously only achievable using WebRTC, cutting the latencies of HESP by more than half!

| Output Protocol | Player | Passthrough / Transcode | RTMP Ingest | SRT Ingest | WebRTC Ingest |
| --- | --- | --- | --- | --- | --- |
| HESP | Ceeblue | Passthrough | 240 | 455 | 205 |
| HESP | Ceeblue | With Ceeblue transcode | 292 | 516 | 266 |

All results in milliseconds.

HESP is an HTTP-based, CDN- and DRM-friendly, cacheable protocol that will soon be taking the sub-500-millisecond market by storm. Ceeblue, as the only provider of both WebRTC and HESP, is doubling down on real-time, and we look forward to providing the integral real-time component to even more innovative sub-second solutions that bring people closer together every day.


Keep your eyes peeled in the coming weeks for future Ceeblue updates regarding our HESP offerings. We have exciting open-source news that will make HESP integration a matter of hours instead of days, just as we have done for WebRTC integration. Stay tuned.

EXPLORE THE CEEBLUE WEBRTC CLIENT SDK

Source Code
github.com/CeeblueTV/webrtc-client

Prebuilt NPM Package
npmjs.com/package/@ceeblue/webrtc-client

TRY OUT THE WEBRTC VIDEO.JS PLUGIN

github.com/CeeblueTV/videojs-plugins

GET YOUR FREE CEEBLUE ACCOUNT

ceeblue.net/free-trial/
