UK streaming provider pairs Peer5 with Flussonic streaming software to achieve new levels of performance

One of the great things about working at Peer5 is that we’re trying to solve some pretty interesting technical challenges. This comes with the territory when your corporate mission statement is to “create the world’s largest CDN without deploying a single server”. We’re very ambitious and technical at heart and this means we love pushing the envelope – in terms of performance, efficiency, robustness, etc. – all the things that geeks love to geek out about.

It’s in this spirit of geekiness that we want to share the story of Alejandro Ferrari, one of our clients who recently achieved some staggering performance numbers using a combination of Flussonic’s streaming software with Peer5’s p2p CDN.

We’ve had the pleasure of working with Alejandro for about a year now. He is a highly experienced streaming engineer who works as an independent streaming consultant.

The setup

For this particular project, given his budget constraints, Alejandro needed to see how many concurrent streams he could support with just 2 mid-size origin servers. Each server had the following specs:

  • 2 x Intel Xeon(R) CPU E5645 @ 2.40GHz
  • 16 GB RAM
  • 1 x 10Gbps network card

Alejandro used Flussonic’s streaming software for this project. Before Peer5 was deployed, each server could handle a maximum of ~10,000 concurrent viewers – i.e., an average bitrate of 1 Mbps per user × 10,000 users = 10 Gbps, which saturates the network card.
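That ceiling is simple arithmetic – NIC bandwidth divided by per-viewer bitrate. A quick back-of-the-envelope sketch (the function name is ours; the numbers are from this post):

```python
def max_concurrent_viewers(nic_gbps, avg_bitrate_mbps):
    """A single origin can serve at most NIC bandwidth / per-viewer bitrate
    concurrent streams, regardless of CPU or RAM headroom."""
    return int(nic_gbps * 1000 / avg_bitrate_mbps)

# 10 Gbps NIC, ~1 Mbps average stream -> ~10,000 viewers per server
print(max_concurrent_viewers(10, 1.0))  # 10000
```

With two such servers, the pre-Peer5 ceiling for the whole deployment was therefore ~20,000 concurrent viewers.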

First iteration: 10,000 to 40,000 streams per server

After deploying Peer5 (and without upgrading any hardware), Alejandro saw an immediate jump in capacity to 40,000 concurrent viewers per server. This was possible, of course, because of the Peer5 magic – we coordinate the viewers and have them send video chunks to each other instead of always fetching chunks from the server, off-loading a large percentage of the HTTP requests the servers would normally receive and freeing them to handle more concurrent viewers. For all the gory details, click here to see how Alejandro tuned his Linux set-up with Flussonic + Peer5 to achieve this goal.

While Alejandro was thrilled with the 40K number, he noticed that the servers were only pushing 6 Gbps of traffic at peak – i.e., 60% of their capacity.

Flussonic #1

This made him think that there was still more efficiency to be extracted. And sure enough, after working with the Flussonic team to increase the maximum number of concurrent connections allowed, a limit that no other Flussonic customer had ever reached, Alejandro was ready to achieve yet another milestone.

Second iteration: 40,000 to 62,000 streams per server

After Flussonic implemented its optimizations (look for issue #2625 in the changelog), Alejandro was eager to test his servers again. This time, he was able to once again saturate his network cards with 10 Gbps of bandwidth but his concurrent stream count jumped from 40,000 to 62,200. Additionally, his buffering percentage decreased from 3.6% to 1.9%.

Flussonic #2

Third iteration: 62,000 to 200,000 streams per server

While most people would call it a day after achieving a 6X performance increase, Alejandro is always pushing for more, and he asked us why he wasn’t hitting 200,000 concurrent viewers per server. Given that Peer5 was offloading ~95% of user requests at peak, a 20X increase in performance is the expected result. The answer is that not all of Alejandro’s viewers were accessing the stream from a WebRTC-enabled browser. As support for WebRTC grows, however, we expect to achieve 200 Gbps of throughput per server in the not too distant future. Now that is something to geek out about!
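The 20X figure falls straight out of the offload ratio: every request the peers absorb is one the origin never serves, so origin capacity scales by 1 / (1 − offload). A minimal sketch of that arithmetic (helper names are ours; the 95% and 10,000-viewer figures are from this post):

```python
def p2p_capacity_multiplier(offload_ratio):
    """Expected gain in origin capacity when a fraction of requests
    is served peer-to-peer instead of by the origin."""
    return 1 / (1 - offload_ratio)

base_viewers = 10_000  # one server saturates its 10 Gbps NIC at this point

# ~95% of requests offloaded to peers -> ~20x the origin's reach
print(round(base_viewers * p2p_capacity_multiplier(0.95)))  # 200000
```

Note the multiplier is only realized for the share of the audience on WebRTC-capable browsers – which is exactly why the measured number landed at 62,200 rather than the theoretical 200,000.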

Are you maximizing the performance of your streaming hardware?

If you’d like to work with Alejandro and Peer5 to squeeze every last ounce of performance from your current streaming hardware, please contact us.

Hadar Weiss

Peer5 Co-Founder and CEO
