In these days of social distancing, as thousands cloister at home to binge-watch TV over the internet, Stanford researchers have designed an algorithm that significantly improves the performance of streaming video.
The new algorithm, named Fugu, was developed with the help of volunteer viewers who streamed video served by the researchers, while machine learning scrutinized the data flow in real time, searching for ways to reduce glitches and stalls.
In a scientific paper, the researchers describe how they developed an algorithm that sends only as much data as the viewer's internet connection can receive without degrading video quality.
Many of the prevailing systems for streaming video are based on the Buffer-Based Algorithm, known as BBA, which was developed seven years ago by then-Stanford graduate student Te-Yuan Huang, along with professors Nick McKeown and Ramesh Johari.
BBA asks the viewer's device how much video it has stored in its buffer. For instance, if the buffer holds less than 5 seconds of video, the algorithm sends lower-quality footage to guard against interruptions. If the buffer holds more than 15 seconds, the algorithm sends the highest quality video possible. If the buffer level falls in between, the algorithm adjusts the quality accordingly.
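The buffer-based logic described above can be sketched as a simple lookup. This is an illustrative Python sketch, not the actual BBA implementation: the 5-second and 15-second thresholds come from the article, but the bitrate ladder and the linear ramp between the thresholds are assumptions for demonstration.

```python
# Hypothetical bitrate ladder (kbps), lowest to highest quality.
BITRATE_LADDER_KBPS = [300, 750, 1500, 3000, 6000]

LOW_RESERVOIR_S = 5.0   # below this buffer level, protect against stalls
HIGH_CUSHION_S = 15.0   # above this buffer level, send top quality

def select_bitrate(buffer_seconds: float) -> int:
    """Pick a bitrate based only on the client's current buffer level."""
    if buffer_seconds <= LOW_RESERVOIR_S:
        # Buffer nearly empty: send the lowest quality to avoid a stall.
        return BITRATE_LADDER_KBPS[0]
    if buffer_seconds >= HIGH_CUSHION_S:
        # Plenty of buffer: send the highest quality available.
        return BITRATE_LADDER_KBPS[-1]
    # In between: map the buffer level linearly onto the ladder.
    frac = (buffer_seconds - LOW_RESERVOIR_S) / (HIGH_CUSHION_S - LOW_RESERVOIR_S)
    index = int(frac * (len(BITRATE_LADDER_KBPS) - 1))
    return BITRATE_LADDER_KBPS[index]
```

The appeal of this design is that it needs no estimate of network throughput at all; the buffer level alone acts as a proxy for whether the connection is keeping up.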