living-doc-mops-streaming-opcons.md

Introduction

This document contains links and references to resources that the IETF Media Operations community deemed potentially useful to operators of streaming media services or network operators for networks that carry traffic from such services.

This document is a companion to draft-ietf-mops-streaming-opcons; please read that document for a more complete explanation.

Document Maintenance

This living document is actively maintained by participants in the Media Operations Working Group of the Internet Engineering Task Force (IETF). New resources may be added, and old resources removed, from the sections below with the consensus of the working group; URLs may be updated or removed by the document editors if the resources they refer to move or are removed by their publishers.

An archive with older versions of this document can be found at:

https://github.com/ietf-wg-mops/draft-ietf-mops-streaming-opcons/commits/main/living-doc-mops-streaming-opcons.md

Status of This Memo

This living document is submitted in full conformance with the provisions of BCP 78 and BCP 79.

Copyright Notice

Copyright (c) 2022 IETF Trust and the persons identified as the document authors. All rights reserved.

This document is subject to BCP 78 and the IETF Trust's Legal Provisions Relating to IETF Documents (https://trustee.ietf.org/license-info) in effect on the date of publication of this document. Please review these documents carefully, as they describe your rights and restrictions with respect to this document. Code Components extracted from this document must include Revised BSD License text as described in Section 4.e of the Trust Legal Provisions and are provided without warranty as described in the Revised BSD License.

Industry Terminology

Surveys and Tutorials

Encoding

The following papers describe how video is encoded, different video encoding standards and tradeoffs in selecting encoding parameters.

Packaging

The following papers summarize the methods for selecting packaging configurations such as the resolution-bitrate pairs, segment durations, use of constant vs. variable-duration segments, etc.

Content Delivery

The following links describe some of the issues and solutions regarding the interconnection of content delivery networks.

ABR Algorithms

The following two surveys describe and compare different rate-adaptation algorithms in terms of metrics such as achieved bitrate/quality, stall rate/duration, bitrate switching frequency, fairness, and network utilization.
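To make the comparison concrete, the simplest family of rate-adaptation algorithms picks the highest rung of the bitrate ladder that fits under a fraction of the measured throughput. The sketch below is illustrative only; the ladder values and the safety margin are assumptions, not taken from any of the surveyed algorithms.

```python
def select_bitrate(throughput_bps, ladder, safety=0.8):
    """Throughput-based ABR: pick the highest ladder rung that fits
    within a safety fraction of the measured throughput.
    Falls back to the lowest rung if nothing fits."""
    budget = throughput_bps * safety
    chosen = min(ladder)  # conservative fallback
    for rate in sorted(ladder):
        if rate <= budget:
            chosen = rate
    return chosen

# Hypothetical 4-rung ladder (bits per second)
ladder = [400_000, 1_200_000, 2_500_000, 5_000_000]
print(select_bitrate(3_600_000, ladder))  # 2500000: 80% of 3.6 Mbps fits the 2.5 Mbps rung
```

Real players layer buffer-occupancy signals, switching dampening, and fairness considerations on top of this throughput rule, which is exactly the design space the surveys above map out.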

Low-Latency Live Adaptive Streaming

The following papers describe the peculiarities of adaptive streaming in low-latency live streaming scenarios.

Server/Client/Network Collaboration

The following papers explain the benefits of server and network assistance in client-driven streaming systems. There is also a good reference on how congestion affects video quality and how rate control works in streaming applications.

QoE Metrics

The following papers describe various QoE metrics one can use in streaming applications.
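Many of these papers evaluate sessions with a linear QoE model that rewards average played bitrate and penalizes bitrate switching and stall time. A minimal sketch of that model family follows; the weights here are illustrative placeholders, not values endorsed by any particular paper.

```python
def session_qoe(bitrates, stall_seconds, w_switch=1.0, w_stall=4.3):
    """Linear session QoE: reward total played bitrate (e.g., Mbps per
    segment), penalize the magnitude of bitrate switches and total
    stall time. Weights are illustrative, not standardized."""
    reward = sum(bitrates)
    switching = sum(abs(b - a) for a, b in zip(bitrates, bitrates[1:]))
    return reward - w_switch * switching - w_stall * stall_seconds

# A 4-segment session at 1/2/2/1 Mbps with 0.5 s of stalling
print(session_qoe([1, 2, 2, 1], 0.5))  # 1.85
```

The interesting disagreements between QoE metrics are exactly in how such weights are chosen and whether a linear model is adequate, which is what the papers above examine.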

Point Clouds and Immersive Media

The following papers explain the latest developments in the immersive media domain (for video and audio) and the developing standards for such media.

Open-Source Tools

Technical Events

List of Organizations Working on Streaming Media

Topics to Keep an Eye on

5G and Media

5G new radio and systems technologies provide new functionalities for video distribution. 5G targets not only smartphones, but also new devices such as augmented reality glasses or automotive receivers. Higher bandwidth, lower latencies, edge and cloud computing functionalities, service-based architectures, low power consumption, broadcast/multicast functionalities and other network functions come hand in hand with new media formats and processing capabilities promising better and more consistent quality for traditional video streaming services as well as enabling new experiences such as immersive media and augmented realities.

Ad Insertion

Ads can be inserted at different stages in the streaming workflow, on the server side or the client side. The DASH-IF guidelines detail server-side ad insertion with period replacements based on manipulating the manifest; HLS interstitials provide a similar approach. The idea is that the manifest can be changed to point to a sub-playlist of segments, possibly hosted at a different location. This approach uses network resources efficiently, as duplicate caching is avoided, but some intelligence at the player is needed to deal with content transitions (e.g., codec changes, timeline gaps, etc.). Player support for such content is gradually maturing.

Another important technology for ad insertion is the signalling of ads and ad breaks, which is still typically based on SCTE-35 for HLS and SCTE-214 for DASH. Such signals provide useful information for scheduling ads and contacting ad servers. SCTE-35 is popular for ad insertion in the broadcast industry, while its exact usage in the OTT space is still being discussed in SCTE. Also important is the identification of ads, for example via Ad-ID or other commercial entities that provide such services; the identification of an ad in a manifest or stream is usually standardized by SMPTE. Finally, tracking of viewer impressions is usually based on the Video Ad Serving Template (VAST) defined by the IAB.
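The manifest-manipulation idea described above can be sketched as splicing an extra Period into a DASH MPD at the cue point. The skeleton below is heavily simplified (real MPDs carry AdaptationSets, segment timing, and more), and all element ids and timing values are hypothetical.

```python
import xml.etree.ElementTree as ET

MPD_NS = "urn:mpeg:dash:schema:mpd:2011"
ET.register_namespace("", MPD_NS)  # serialize without a namespace prefix

# Skeleton two-period manifest; contents heavily simplified for illustration
mpd_xml = (
    f'<MPD xmlns="{MPD_NS}">'
    '<Period id="content-1" start="PT0S"/>'
    '<Period id="content-2" start="PT90S"/>'
    '</MPD>'
)

def insert_ad_period(mpd_text, after_id, ad_id, ad_start):
    """Insert a new Period (the ad break) right after the Period
    whose id matches after_id."""
    root = ET.fromstring(mpd_text)
    for i, child in enumerate(list(root)):
        if child.tag == f"{{{MPD_NS}}}Period" and child.get("id") == after_id:
            ad = ET.Element(f"{{{MPD_NS}}}Period", id=ad_id, start=ad_start)
            root.insert(i + 1, ad)
            break
    return ET.tostring(root, encoding="unicode")

patched = insert_ad_period(mpd_xml, "content-1", "ad-break-1", "PT60S")
print(patched)
```

In practice the ad Period would reference segments on an ad server, and the player has to handle the codec and timeline transitions at each Period boundary, as noted above.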

Contribution and Ingest

There are different contribution and ingest specifications dealing with different use cases. A common case is contribution to a broadcast or streaming headend, which previously happened over satellite; RIST and SRT are examples of such contribution protocols. Within a streaming headend, the encoder and the packager/CDN may have an ingest/contribution interface as well, which is specified by DASH-IF Ingest.

Synchronized Encoding and Packaging

Practical streaming headends need redundant encoders and packagers to operate without glitches and blackouts. Redundant operation requires synchronization between two or more encoders, and also between two or more packagers, that possibly handle different inputs and outputs while generating compatible, interchangeable output representations. This problem is important for anyone developing a streaming headend at scale, and the synchronization problem is currently under discussion in the wider community. Follow the developments at: https://sites.google.com/view/encodersyncworkshop/home
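One common building block for such synchronization is epoch-based segment alignment: every encoder derives segment boundaries from a shared time reference, so independently running instances cut segments at the same media times. The sketch below illustrates the idea; the epoch and segment duration are assumptions, and real deployments must also align IDR frames, sequence numbers, and timescales.

```python
EPOCH = 0.0              # shared wall-clock reference all encoders agree on (assumption)
SEGMENT_DURATION = 6.0   # constant segment duration in seconds (assumption)

def segment_index(media_time):
    """Map a timestamp to its segment number relative to the shared epoch."""
    return int((media_time - EPOCH) // SEGMENT_DURATION)

def segment_boundaries(media_time):
    """Return (index, start, end) of the segment containing media_time.
    Any encoder evaluating this for the same timestamp gets the same cut."""
    idx = segment_index(media_time)
    start = EPOCH + idx * SEGMENT_DURATION
    return idx, start, start + SEGMENT_DURATION

print(segment_boundaries(123.4))  # (20, 120.0, 126.0)
```

Because the boundaries are a pure function of the timestamp, a restarted or failed-over encoder resumes producing segments that are interchangeable with those of its peers, which is the property redundant headends need.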

WebRTC-Based Streaming

WebRTC is increasingly being used for streaming of time-sensitive content such as live sporting events. Innovations in cloud computing allow implementers to efficiently scale delivery of content using WebRTC. Support for WebRTC communication is available on all modern web browsers and is available on native clients for all major platforms.