Streaming media is multimedia that is continuously received by and presented to an end user while being delivered by a provider. The verb to stream refers to the process of delivering or obtaining media in this manner; streaming describes the delivery method of the medium rather than the medium itself. Distinguishing the delivery method from the media distributed applies specifically to telecommunications networks, as most other delivery systems are either inherently streaming (e.g. radio, television, streaming apps) or inherently non-streaming (e.g. books, video cassettes, audio CDs). Streaming content over the Internet presents challenges: users whose Internet connection lacks sufficient bandwidth may experience stops, lag, or slow buffering of the content, and users lacking compatible hardware or software systems may be unable to stream certain content.
Live streaming is the delivery of Internet content in real-time much as live television broadcasts content over the airwaves via a television signal. Live internet streaming requires a form of source media (e.g. a video camera, an audio interface, screen capture software), an encoder to digitize the content, a media publisher, and a content delivery network to distribute and deliver the content. Live streaming does not need to be recorded at the origination point, although it frequently is.
Streaming is an alternative to file downloading, a process in which the end-user obtains the entire file for the content before watching or listening to it. Through streaming, an end-user can use their media player to start playing digital video or digital audio content before the entire file has been transmitted. The term "streaming media" can apply to media other than video and audio, such as live closed captioning, ticker tape, and real-time text, which are all considered "streaming text".
Elevator music was among the earliest popular music available as streaming media; nowadays Internet television is a common form of streamed media. Some popular streaming services include Netflix, Disney+, Hulu, Prime Video, the video sharing website YouTube, and other sites which stream films and television shows; Apple Music and Spotify, which stream music; and the video game live streaming site Twitch.
In the early 1920s, George O. Squier was granted patents for a system for the transmission and distribution of signals over electrical lines, which was the technical basis for what later became Muzak, a technology streaming continuous music to commercial customers without the use of radio.
Attempts to display media on computers date back to the earliest days of computing in the mid-20th century. However, little progress was made for several decades, primarily due to the high cost and limited capabilities of computer hardware. From the late 1980s through the 1990s, consumer-grade personal computers became powerful enough to display various media. The primary technical issues related to streaming were having enough CPU and bus bandwidth to support the required data rates, achieving real-time computing performance required to prevent buffer underrun and enable smooth streaming of the content. However, computer networks were still limited in the mid-1990s, and audio and video media were usually delivered over non-streaming channels, such as playback from a local hard disk drive or CD-ROMs on the end user's computer.
In 1990 the first commercial Ethernet switch was introduced by Kalpana, which enabled the more powerful computer networks that led to the first streaming video solutions used by schools and corporations.
Multimedia compression
Practical streaming media was only made possible with advances in data compression, due to the impractically high bandwidth requirements of uncompressed media. Raw digital audio encoded with pulse-code modulation (PCM) requires a bandwidth of 1.4 Mbit/s for uncompressed CD audio, while raw digital video requires a bandwidth of 168 Mbit/s for SD video and over 1000 Mbit/s for FHD video.
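The CD-audio figure above follows directly from the PCM parameters; a quick back-of-the-envelope check (the SD video parameters below are an illustrative assumption, one common configuration among several):

```python
# Uncompressed CD audio: 44,100 samples/s, 16 bits/sample, 2 channels.
sample_rate = 44_100      # Hz
bit_depth = 16            # bits per sample
channels = 2

cd_audio_bps = sample_rate * bit_depth * channels
print(f"CD audio: {cd_audio_bps / 1e6:.3f} Mbit/s")      # ≈ 1.411 Mbit/s

# Raw SD video, assuming 720x480 pixels, 8-bit 4:2:2 chroma, ~29.97 fps.
# 4:2:2 subsampling averages 16 bits per pixel.
width, height, fps = 720, 480, 29.97
bits_per_pixel = 16
sd_video_bps = width * height * bits_per_pixel * fps
print(f"Raw SD video: {sd_video_bps / 1e6:.0f} Mbit/s")  # ≈ 166 Mbit/s
```

Under these assumptions the raw SD rate lands in the same range as the 168 Mbit/s cited above; either figure is two orders of magnitude beyond what a 1990s consumer connection could carry, which is why lossy compression was a precondition for practical streaming.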
The most important compression technique that enabled practical streaming media is the discrete cosine transform (DCT), a form of lossy compression first proposed in 1972 by Nasir Ahmed, who developed the algorithm with T. Natarajan and K. R. Rao at the University of Texas in 1973. The DCT algorithm formed the basis for the first practical video coding format, H.261, in 1988; it was initially used for online video conferencing. It was followed by more popular DCT-based video coding standards, most notably the MPEG video formats from 1991 onwards.
The DCT algorithm was adapted into the modified discrete cosine transform (MDCT) by J. P. Princen, A. W. Johnson and A. B. Bradley at the University of Surrey in 1987. The MDCT algorithm is fundamental to the MP3 audio format introduced in 1994 and especially the more widely used Advanced Audio Coding (AAC) format introduced in 1999.
Late 1990s to early 2000s
During the late 1990s and early 2000s, users had increased access to computer networks, especially the Internet. During the early 2000s, users had access to increased network bandwidth, especially in the "last mile". These technological improvements facilitated the streaming of audio and video content to computer users in their homes and workplaces. There was also an increasing use of standard protocols and formats, such as TCP/IP, HTTP, HTML as the Internet became increasingly commercialized, which led to an infusion of investment into the sector.
The band Severe Tire Damage was the first group to perform live on the Internet. On June 24, 1993, the band was playing a gig at Xerox PARC while elsewhere in the building, scientists were discussing new technology (the Mbone) for broadcasting on the Internet using multicasting. As proof of PARC's technology, the band's performance was broadcast and could be seen live in Australia and elsewhere. In a March 2017 interview, band member Russ Haines stated that the band had used approximately "half of the total bandwidth of the internet" to stream the performance, which was a 152-by-76 pixel video, updated eight to twelve times per second, with audio quality that was "at best, a bad telephone connection".
Microsoft Research developed a Microsoft TV application, compiled under the Microsoft Windows Studio Suite and tested in conjunction with the Connectix QuickCam. RealNetworks pioneered the broadcast of a baseball game between the New York Yankees and the Seattle Mariners over the Internet in 1995. The first symphonic concert on the Internet—a collaboration between the Seattle Symphony and guest musicians Slash, Matt Cameron, and Barrett Martin—took place at the Paramount Theater in Seattle, Washington, on November 10, 1995. Word Magazine featured the first ever streaming soundtracks on the Internet when it launched in 1995.
Metropolitan Opera Live in HD streams live performances of the Metropolitan Opera. For the 2013–2014 season, ten operas were transmitted via satellite into at least two thousand theaters in sixty-six countries.
Etymology
The term "streaming" was first used for tape drives manufactured by Data Electronics Inc. that were meant to slowly ramp up and run for the entire track; slower ramp times lowered drive costs. "Streaming" was applied in the early 1990s as a better description for video on demand and later live video on IP networks, first by Starlight Networks for video streaming and RealNetworks for audio streaming. Such video had previously been referred to by the misnomer "store and forward video."
Business developments
The first commercial streaming product appeared in late 1992 and was named StarWorks. StarWorks enabled on-demand MPEG-1 full-motion videos to be randomly accessed on corporate Ethernet networks. StarWorks came from Starlight Networks, which also pioneered live video streaming on Ethernet and via Internet Protocol over satellites with Hughes Network Systems. Other early companies that created streaming media technology include RealNetworks (then known as Progressive Networks) and Protocomm, both prior to widespread World Wide Web usage. Once the web became popular in the late 1990s, streaming video on the Internet blossomed from startups such as VDOnet, acquired by RealNetworks, and Precept, acquired by Cisco.
Microsoft developed a media player known as ActiveMovie in 1995 that allowed streaming media and included a proprietary streaming format, the precursor to the streaming feature later included in Windows Media Player 6.4 in 1999. In June 1999 Apple also introduced a streaming media format in its QuickTime 4 application, which was later widely adopted on websites along with the RealPlayer and Windows Media streaming formats. The competing formats required each user to download the respective application, and many users ended up needing all three on their computer for general compatibility.
In 2000 Industryview.com launched its "world's largest streaming video archive" website to help businesses promote themselves. Webcasting became an emerging tool for business marketing and advertising that combined the immersive nature of television with the interactivity of the Web. The ability to collect data and feedback from potential customers caused this technology to gain momentum quickly.
Around 2002, interest in a single, unified streaming format and the widespread adoption of Adobe Flash prompted the development of a video streaming format through Flash, which was the format used in Flash-based players on video hosting sites. The first popular video streaming site, YouTube, was founded by Steve Chen, Chad Hurley and Jawed Karim in 2005. It initially used a Flash-based player, which played MPEG-4 AVC video and AAC audio, but now defaults to HTML5 video. Increasing consumer demand for live streaming prompted YouTube to implement a live streaming service for users. The company also offers a (secured) link that returns the user's available connection speed.
The Recording Industry Association of America (RIAA) revealed through its 2015 earnings report that streaming services were responsible for 34.3 percent of the year's total music industry's revenue, growing 29 percent from the previous year and becoming the largest source of income, pulling in around $2.4 billion. US streaming revenue grew 57 percent to $1.6 billion in the first half of 2016 and accounted for almost half of industry sales.
Streaming wars
The term "streaming wars" was coined to discuss the new era of competition between video streaming services such as Netflix, Amazon Prime Video, Hulu, HBO Max and Apple TV+.
The audio stream is compressed to make the file size smaller using an audio coding format such as MP3, Vorbis, AAC or Opus. The video stream is compressed using a video coding format such as H.264, HEVC, VP8 or VP9. Encoded audio and video streams are assembled in a container "bitstream" such as MP4, FLV, WebM, ASF or ISMA. The bitstream is delivered from a streaming server to a streaming client (e.g., the computer user with their Internet-connected laptop) using a transport protocol, such as Adobe's RTMP or RTP. In the 2010s, technologies such as Apple's HLS, Microsoft's Smooth Streaming, Adobe's HDS and non-proprietary formats such as MPEG-DASH emerged to enable adaptive bitrate streaming over HTTP as an alternative to proprietary transport protocols. Often, a streaming transport protocol is used to send video from an event venue to a "cloud" transcoding service and CDN, which then uses HTTP-based transport protocols to distribute the video to individual homes and users. The streaming client (the end user) may interact with the streaming server using a control protocol, such as MMS or RTSP.
The quality of the interaction between servers and users depends on the workload of the streaming service; as more users attempt to access a service, quality degrades unless there is enough bandwidth or the host is using enough proxy networks. Deploying clusters of streaming servers is one such method: regional servers are spread across the network, managed by a single central server containing copies of all the media files as well as the IP addresses of the regional servers. This central server then uses load balancing and scheduling algorithms to redirect users to nearby regional servers capable of accommodating them. This approach also allows the central server to provide streaming data to both users and regional servers, using FFmpeg libraries if required, which demands powerful data-processing and immense storage capabilities of the central server. In return, workloads on the streaming backbone network are balanced and alleviated, allowing for optimal streaming quality.
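The redirect step described above can be sketched as follows. The region names, addresses, loads, and capacities are illustrative assumptions, not any particular service's values:

```python
# Sketch of a central server's redirect logic for a cluster of regional
# streaming servers. All figures here are made up for illustration.

REGIONAL_SERVERS = {
    "eu-west":  {"ip": "203.0.113.10", "load": 120, "capacity": 500},
    "us-east":  {"ip": "203.0.113.20", "load": 500, "capacity": 500},
    "ap-south": {"ip": "203.0.113.30", "load": 60,  "capacity": 300},
}

def pick_server(client_region: str) -> str:
    """Redirect a client to its regional server if it has spare capacity,
    otherwise to the least-utilized server anywhere on the network."""
    local = REGIONAL_SERVERS.get(client_region)
    if local and local["load"] < local["capacity"]:
        return local["ip"]
    # Fall back to the server with the lowest utilization ratio.
    best = min(REGIONAL_SERVERS.values(),
               key=lambda s: s["load"] / s["capacity"])
    return best["ip"]

print(pick_server("eu-west"))   # local server has headroom
print(pick_server("us-east"))   # full: redirected to least-loaded server
```

A real scheduler would also weigh network distance and link state, but the shape of the decision, local first, then least-loaded fallback, is the same.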
Protocol challenges
Designing a network protocol to support streaming media raises many problems. Datagram protocols, such as the User Datagram Protocol (UDP), send the media stream as a series of small packets. This is simple and efficient; however, there is no mechanism within the protocol to guarantee delivery. It is up to the receiving application to detect loss or corruption and recover data using error correction techniques. If data is lost, the stream may suffer a dropout. The Real-time Streaming Protocol (RTSP), Real-time Transport Protocol (RTP) and the Real-time Transport Control Protocol (RTCP) were specifically designed to stream media over networks. RTSP runs over a variety of transport protocols, while the latter two are built on top of UDP.
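The receiver-side loss detection mentioned above is usually built on sequence numbers; RTP, for instance, carries a 16-bit sequence-number field for exactly this purpose. A minimal sketch, with an illustrative (not RTP-conformant) packet format:

```python
# Minimal sketch of receiver-side loss detection for a datagram stream.
# The two-byte header here is an illustrative assumption; real protocols
# such as RTP define a richer header around the same idea.

import struct

def make_packet(seq: int, payload: bytes) -> bytes:
    """Prefix the payload with a 16-bit big-endian sequence number."""
    return struct.pack("!H", seq) + payload

def detect_gaps(packets: list[bytes]) -> list[int]:
    """Return the sequence numbers missing from a run of received packets."""
    seqs = sorted(struct.unpack("!H", p[:2])[0] for p in packets)
    expected = set(range(seqs[0], seqs[-1] + 1))
    return sorted(expected - set(seqs))

# Simulate a stream in which the network dropped packet 2.
sent = [make_packet(i, b"frame") for i in range(5)]
received = [p for i, p in enumerate(sent) if i != 2]
print(detect_gaps(received))   # [2]
```

Once a gap is detected the application decides what to do: conceal the dropout, apply forward error correction, or (in reliable designs) request retransmission.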
Another approach that seems to incorporate both the advantages of using a standard web protocol and the ability to be used for streaming even live content is adaptive bitrate streaming. HTTP adaptive bitrate streaming is based on HTTP progressive download, but contrary to the previous approach, here the files are very small, so that they can be compared to the streaming of packets, much like the case of using RTSP and RTP. Reliable protocols, such as the Transmission Control Protocol (TCP), guarantee correct delivery of each bit in the media stream. However, they accomplish this with a system of timeouts and retries, which makes them more complex to implement. It also means that when there is data loss on the network, the media stream stalls while the protocol handlers detect the loss and retransmit the missing data. Clients can minimize this effect by buffering data for display. While delay due to buffering is acceptable in video on demand scenarios, users of interactive applications such as video conferencing will experience a loss of fidelity if the delay caused by buffering exceeds 200 ms.
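At the heart of adaptive bitrate streaming is a client-side choice: measure recent throughput, then fetch the next segment at the highest bitrate the connection can sustain. A minimal sketch; the bitrate ladder and safety margin are assumptions, and real HLS/DASH players layer buffer-occupancy heuristics on top of this:

```python
# Sketch of the per-segment rendition choice in adaptive bitrate streaming.
# The ladder and the 80% safety margin are illustrative assumptions.

LADDER_KBPS = [400, 800, 1600, 3200, 6400]   # available renditions

def choose_rendition(measured_kbps: float, margin: float = 0.8) -> int:
    """Pick the highest bitrate at most `margin` of measured throughput;
    if even the lowest rendition exceeds the budget, take the floor."""
    budget = measured_kbps * margin
    usable = [b for b in LADDER_KBPS if b <= budget]
    return max(usable) if usable else min(LADDER_KBPS)

print(choose_rendition(5000))   # 3200: 6400 would exceed 80% of 5000
print(choose_rendition(300))    # 400: below the ladder, take the floor
```

Because each segment is an ordinary HTTP object, the client can switch renditions at every segment boundary, which is what lets the stream adapt to changing network conditions without stalling.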
Unicast protocols send a separate copy of the media stream from the server to each recipient. Unicast is the norm for most Internet connections, but does not scale well when many users want to view the same television program concurrently. Multicast protocols were developed to reduce the server/network loads resulting from duplicate data streams that occur when many recipients receive unicast content streams independently. These protocols send a single stream from the source to a group of recipients. Depending on the network infrastructure and type, multicast transmission may or may not be feasible. One potential disadvantage of multicasting is the loss of video on demand functionality. Continuous streaming of radio or television material usually precludes the recipient's ability to control playback. However, this problem can be mitigated by elements such as caching servers, digital set-top boxes, and buffered media players.
IP Multicast provides a means to send a single media stream to a group of recipients on a computer network. A multicast protocol, usually Internet Group Management Protocol, is used to manage delivery of multicast streams to the groups of recipients on a LAN. One of the challenges in deploying IP multicast is that routers and firewalls between LANs must allow the passage of packets destined to multicast groups. If the organization that is serving the content has control over the network between server and recipients (i.e., educational, government, and corporate intranets), then routing protocols such as Protocol Independent Multicast can be used to deliver stream content to multiple Local Area Network segments. Because multicast delivers content to many recipients with far less energy and other resources than mass unicast delivery, the widespread introduction of reliable multicast (broadcast-like) protocols, and their preferential use wherever possible, is a significant ecological and economic challenge. Peer-to-peer (P2P) protocols arrange for prerecorded streams to be sent between computers. This prevents the server and its network connections from becoming a bottleneck. However, it raises technical, performance, security, quality, and business issues.
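On a LAN, a receiving application joins a multicast group through the sockets API, which causes the operating system to emit the IGMP membership report mentioned above. A sketch; the group address and port are illustrative choices (the address sits in the administratively scoped 239.0.0.0/8 range):

```python
# Sketch of how a receiving application joins an IP multicast group.
# GROUP and PORT are illustrative assumptions, not values from any
# particular service.

import socket
import struct

GROUP = "239.1.1.1"   # administratively scoped multicast address
PORT = 5004           # port often used for RTP streams

# struct ip_mreq: 4-byte group address + 4-byte local interface (INADDR_ANY).
mreq = struct.pack("4s4s", socket.inet_aton(GROUP),
                   socket.inet_aton("0.0.0.0"))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
try:
    # This setsockopt triggers an IGMP membership report on the LAN;
    # after binding, datagrams sent to GROUP:PORT arrive on this socket.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    sock.bind(("", PORT))
except OSError:
    pass   # some sandboxed hosts have no multicast-capable interface
finally:
    sock.close()
```

The sender's side needs no membership at all: it simply transmits datagrams to the group address, and the network replicates them toward every joined receiver.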
Applications and marketing
Typical applications of the streaming concept include long video lectures delivered online. An advantage of this presentation is that the lectures can be very long, yet viewers can interrupt or replay them at arbitrary points. There are also new marketing concepts. The Berlin Philharmonic Orchestra, for example, sells Internet live streams of whole concerts through its "Digital Concert Hall" instead of several CDs or similar fixed media, using YouTube for trailers only. These online concerts are also shown in cinemas at various places around the globe. The Metropolitan Opera in New York uses a similar concept. There is also a live stream from the International Space Station. In video entertainment, streaming platforms such as Netflix, Hulu, and Disney+ are mainstream elements of the media industry.
Recording
Media that is live streamed can be recorded through certain media players such as VLC media player, or through the use of a screen recorder. Live-streaming platforms such as Twitch may also incorporate a video on demand system that allows automatic recording of live broadcasts so that they can be watched later. The popular site YouTube also has recordings of live broadcasts, including television shows aired on major networks. These streams can be recorded by anyone who has access to them, whether legally or otherwise.
Copyright
Streaming copyrighted content can involve making infringing copies of the works in question. The recording and distribution of streamed content is also an issue for many companies that rely on revenue based on views or attendance.
Greenhouse gas emissions
The net greenhouse gas emissions from streaming music have been estimated at between 200 and 350 million kilograms per year in the United States, according to a 2019 study. This is an increase from emissions in the pre-digital music period, which were estimated at "140 million kilograms in 1977, 136 million kilograms in 1988, and 157 million in 2000."
There are several ways to decrease the greenhouse gas emissions associated with streaming music, including efforts to make data centers carbon neutral by converting to electricity produced from renewable sources. On an individual level, purchasing a physical CD may be more environmentally friendly if it will be played more than 27 times. Downloading music for offline listening also reduces the need for repeated streaming over distance; the Spotify service includes a built-in local cache to reduce the necessity of repeating song streams.
See also
- Comparison of music streaming systems
- Comparison of streaming media systems
- Comparison of video streaming aggregators
- Comparison of video hosting services
- Content delivery platform
- Digital Living Network Alliance (DLNA)
- Digital television
- IPTV
- List of streaming media systems
- Live streaming
- Live streaming world news
- M3U playlists
- Over-the-top media service
- P2PTV
- Protection of Broadcasts and Broadcasting Organizations Treaty
- Push technology
- Real-time data
- Stream processing
- Stream recorder
- Web syndication