Ask any sports fan and they’ll tell you that they have a love/hate relationship with live streaming. Most of the time, live streaming works like a charm, but there’s always that odd occasion when things don’t quite work out how they should.
Unfortunately, that odd occasion always seems to come during a match your team can’t afford to lose. You can picture it now – your team’s desperately chasing the game, trailing by a single goal, when suddenly one of your players is put clean through with just the keeper to beat. You’re on the edge of your seat and you know he can’t miss when – AARRGH! – the stream freezes at that critical moment.
These incredibly frustrating moments have happened to us all, but if the BBC has its way, the sight of that annoying little ‘buffering’ wheel may soon be a thing of the past.
The UK public service broadcaster is hard at work on a new technology called adaptive streaming for its popular BBC iPlayer service. It aims to skirt the problem of iffy internet connections by encoding each media file at multiple bitrates, with each bitrate offering a different level of picture quality (the higher the bitrate, the better the picture). The idea is that your streaming client selects the most appropriate bitrate for your internet connection in real time, switching constantly as the quality of your connection fluctuates, so that your live stream is never again interrupted.
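The BBC hasn’t published the details of its algorithms, but the core mechanism can be sketched in a few lines. In this purely illustrative Python sketch, the bitrate ladder, the safety margin and the function name are all assumptions, not anything the BBC has confirmed:

```python
# A minimal sketch of the basic idea (not the BBC's actual algorithm):
# the client measures its recent download throughput and picks the highest
# bitrate rung it can sustain, re-evaluating for every chunk of video.

BITRATE_LADDER_KBPS = [400, 800, 1500, 3000, 5000]  # hypothetical encodings

def select_bitrate(measured_throughput_kbps, safety_margin=0.8):
    """Pick the highest bitrate that fits within the measured throughput,
    leaving some headroom so a small dip doesn't stall playback."""
    budget = measured_throughput_kbps * safety_margin
    affordable = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return affordable[-1] if affordable else BITRATE_LADDER_KBPS[0]

print(select_bitrate(2000))  # a connection managing ~2 Mbit/s -> 1500 kbit/s
```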
One of the biggest challenges for the Beeb is optimising adaptive streaming so that it performs well across many different network environments at the same time. That means creating dozens of different algorithms that can identify the characteristics of each network over which a stream has been requested, and then decide which bitrate to request for it, so that the stream is delivered uninterrupted.
This might sound simple enough, but it isn’t, and not just because wireless networks are notoriously ‘iffy’, with signal strength fluctuating all the time. The other problem is that wired networks behave in a completely different way to wireless ones. TCP, the protocol that ensures data is transmitted reliably over the internet, was originally designed for wired networks. On a wired network, packet loss is a sign of congestion, so TCP decreases the flow of data being sent in order to alleviate the traffic jam. On a wireless network, however, packet loss is usually down to a random fluctuation in connection strength that soon corrects itself, so decreasing the data flow would mean shifting to a lower bitrate quite unnecessarily.
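One common mitigation (though the blog post doesn’t say whether the BBC’s algorithms work this way) is to smooth the throughput measurements, so that a one-off wireless dip doesn’t drag the estimate down and force an unnecessary switch to a lower bitrate. A minimal sketch, with entirely hypothetical numbers:

```python
# Smooth throughput samples with an exponentially weighted moving average
# so that a momentary wireless glitch barely moves the estimate.

def smoothed_throughput(samples_kbps, alpha=0.3):
    """Exponentially weighted moving average of throughput samples."""
    estimate = samples_kbps[0]
    for sample in samples_kbps[1:]:
        estimate = alpha * sample + (1 - alpha) * estimate
    return estimate

# Hypothetical samples: a steady ~3 Mbit/s link with one momentary glitch
samples = [3000, 3100, 900, 3050, 2950]
print(round(smoothed_throughput(samples)))  # ~2697: the dip barely registers
```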
Competing users present the BBC with another problem. When several adaptive streaming clients such as iPlayer share the same connection, they can interact badly: one client may end up hogging all of the bandwidth, causing problems for everyone else, or the quality of everyone’s stream may swing all over the place, jumping up and down every few seconds as the clients scramble to adjust to each other’s constantly changing bandwidth demands.
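One standard defence against that kind of oscillation is hysteresis: only change bitrate when the measured throughput has moved decisively, so competing clients stop chasing each other up and down every few seconds. The sketch below is illustrative only; the blog post doesn’t say how the BBC actually solves this:

```python
# Hysteresis: step up only with clear headroom, step down only when the
# current rung is genuinely unsustainable, otherwise hold steady.

LADDER_KBPS = [400, 800, 1500, 3000, 5000]  # hypothetical encodings

def next_bitrate(current_kbps, throughput_kbps, up_factor=1.25):
    idx = LADDER_KBPS.index(current_kbps)
    if idx + 1 < len(LADDER_KBPS) and throughput_kbps >= LADDER_KBPS[idx + 1] * up_factor:
        return LADDER_KBPS[idx + 1]          # clear headroom: move up a rung
    if throughput_kbps < current_kbps:
        return LADDER_KBPS[max(idx - 1, 0)]  # unsustainable: move down a rung
    return current_kbps                      # otherwise hold, damping flapping

print(next_bitrate(1500, 2000))  # holds at 1500 rather than flapping to 3000
```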
To overcome these problems, the BBC has built a number of models of residential broadband networks so that it can test a variety of adaptive streaming algorithms and eventually arrive at one that does the job consistently for thousands of different users at once. The broadcaster noted on its research and development blog that IPTV (internet protocol television) is growing in popularity all the time and is no longer something the BBC can ignore, even though the vast majority of its content is still delivered through more conventional broadcasting methods. With mobile devices and iPlayer so popular nowadays, an increasing number of viewers will be using smartphones, tablets and even smart HDTV sets to watch TV programmes on demand, and the organisation is doing everything it can to be ready.
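A toy version of that kind of test might look like the following; this is purely illustrative, and the BBC’s models of residential broadband are far more detailed:

```python
# Split a fixed-capacity link equally between clients and see where each
# one's bitrate settles under a simple throughput-based selection rule.

LADDER_KBPS = [400, 800, 1500, 3000, 5000]  # hypothetical encodings

def settled_bitrate(share_kbps, margin=0.8):
    affordable = [b for b in LADDER_KBPS if b <= share_kbps * margin]
    return affordable[-1] if affordable else LADDER_KBPS[0]

LINK_CAPACITY_KBPS = 8000
for n_clients in (1, 2, 4, 8):
    share = LINK_CAPACITY_KBPS / n_clients  # idealised equal split
    print(f"{n_clients} clients -> {settled_bitrate(share)} kbit/s each")
```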
Source: BBC R&D Blog