ACMMM’19 Tutorial: A Journey towards Fully Immersive Media Access

ACM Multimedia 2019
October 21-25, 2019, Nice, France
https://www.acmmm.org/2019/

Note: The exact date/time slot of this tutorial will be announced at a later stage.

Lecturers
Christian Timmerer, Alpen-Adria-Universität Klagenfurt & Bitmovin, Inc.
Ali C. Begen, Ozyegin University and Networked Media

Abstract
Universal media access as proposed in the late 1990s and early 2000s is now a reality: we can generate, distribute, share, and consume any media content, anywhere, anytime, and with/on any device. A major technical breakthrough was adaptive streaming over HTTP, which resulted in the standardization of MPEG-DASH and is now successfully deployed in HTML5 environments thanks to the corresponding Media Source Extensions (MSE). The next big thing in adaptive media streaming is virtual reality applications and, specifically, omnidirectional (360°) media streaming, which is currently built on top of the existing adaptive streaming ecosystems. This tutorial provides a detailed overview of adaptive streaming of both traditional and omnidirectional media within HTML5 environments. It focuses on the basic principles and paradigms for adaptive streaming of traditional and omnidirectional media, as well as on already deployed content generation, distribution, and consumption workflows. Additionally, the tutorial provides insights into standards and emerging technologies in the adaptive streaming space. Finally, it covers the latest approaches for immersive media streaming enabling 6DoF DASH through point cloud compression (PCC) and concludes with open research issues and industry efforts in this domain.
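To give a concrete feel for how MSE enables DASH-style playback in HTML5, the following TypeScript sketch appends fetched segments to a video element via a SourceBuffer. It is a minimal illustration only; the segment names, server URL, and codec string are hypothetical placeholders and not part of the tutorial material.

  // Minimal sketch of DASH-style playback via Media Source Extensions (MSE).
  // Segment names, server URL, and codec string are hypothetical placeholders.
  const video = document.querySelector('video') as HTMLVideoElement;
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener('sourceopen', async () => {
    const mime = 'video/mp4; codecs="avc1.64001f"'; // assumed H.264 representation
    const sourceBuffer = mediaSource.addSourceBuffer(mime);

    // Append the initialization segment first, then the media segments in order.
    const segments = ['init.mp4', 'seg-1.m4s', 'seg-2.m4s']; // placeholder names
    for (const name of segments) {
      const data = await (await fetch(`https://example.com/dash/${name}`)).arrayBuffer();
      await appendSegment(sourceBuffer, data);
    }
    mediaSource.endOfStream();
  });

  // Wait for each append to complete before issuing the next one.
  function appendSegment(sb: SourceBuffer, data: ArrayBuffer): Promise<void> {
    return new Promise((resolve) => {
      sb.addEventListener('updateend', () => resolve(), { once: true });
      sb.appendBuffer(data);
    });
  }

In practice, a DASH player additionally parses the MPD and switches between representations based on measured throughput and buffer occupancy, which is where the adaptive behavior comes from.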

Keywords: Omnidirectional media, HTTP adaptive streaming, over-the-top video, 360 video, virtual reality, immersive media access.

Learning Objectives
This tutorial consists of two main parts. In the first part, we provide a detailed overview of the HTML5 standard and show how it can be used for adaptive streaming deployments. In particular, we focus on HTML5 video and media source extensions as well as multi-bitrate encoding, encapsulation and encryption workflows, and we survey well-established streaming solutions. Furthermore, we present experiences from existing deployments and the relevant de jure and de facto standards (DASH, HLS, CMAF) in this space. In the second part, we focus on omnidirectional (360-degree) media from creation to consumption, as well as first thoughts on dynamic adaptive point cloud streaming. We survey means for the acquisition, projection, coding and packaging of omnidirectional media, as well as delivery, decoding and rendering methods. Emerging standards and industry practices (OMAF, VR-IF) are covered as well. Both parts present some of the current research trends, open issues that need further exploration and investigation, and various efforts that are underway in the streaming industry. After attending this tutorial, the participants will have an overview and understanding of the following topics:

  • Principles of HTTP adaptive streaming for the Web/HTML5
  • Principles of omnidirectional (360-degree) media delivery
  • Content generation, distribution and consumption workflows for traditional and omnidirectional media
  • Standards and emerging technologies in the adaptive streaming space
  • Current and future research on traditional and omnidirectional media delivery, specifically enabling 6DoF adaptive streaming through point cloud compression

ACM Multimedia attracts attendees who are quite knowledgeable in specific areas. However, not all are experts across multiple disciplines (such as the subject matter here), and only a few follow what is happening in this field and in the relevant standards bodies. Thus, we believe the proposed tutorial will be of as much interest to this year’s attendees as it has been in the past.

Table of Contents
Part I: The HTML5 Standard and Adaptive Streaming

  • HTML5 video and media extensions
  • Survey of well-established streaming solutions
  • Multi-bitrate encoding, encapsulation and encryption workflows (see the bitrate-selection sketch after this list)
  • The MPEG-DASH standard, Apple HLS and the developing CMAF standard
  • Common issues in scaling and improving quality, multi-screen/hybrid delivery
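To make the multi-bitrate item above concrete, a simple throughput-based representation selection could look like the sketch below. The bitrate ladder and the safety margin are illustrative assumptions for this sketch and are not values prescribed by DASH, HLS, or CMAF.

  // Illustrative throughput-based bitrate selection for HTTP adaptive streaming.
  // The bitrate ladder and safety margin below are assumptions for the sketch.
  interface Representation {
    id: string;
    bandwidth: number; // bits per second, as advertised in the manifest
  }

  const ladder: Representation[] = [
    { id: '360p', bandwidth: 1_000_000 },
    { id: '720p', bandwidth: 3_000_000 },
    { id: '1080p', bandwidth: 6_000_000 },
  ];

  // Pick the highest representation whose bitrate fits within the measured
  // throughput scaled by a safety margin; fall back to the lowest otherwise.
  function selectRepresentation(
    measuredThroughputBps: number,
    safetyMargin = 0.8,
  ): Representation {
    const budget = measuredThroughputBps * safetyMargin;
    const affordable = ladder.filter((r) => r.bandwidth <= budget);
    return affordable.length > 0 ? affordable[affordable.length - 1] : ladder[0];
  }

  // Example: with ~4 Mbit/s of measured throughput, the 720p representation wins.
  console.log(selectRepresentation(4_000_000).id); // "720p"

Real players combine such throughput estimates with buffer-occupancy signals and switching constraints, which is one of the scaling and quality issues addressed in Part I.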

Part II: Omnidirectional (360-degree) Media

  • Acquisition, projection, coding and packaging of 360-degree video
  • Delivery, decoding and rendering methods (see the tile-selection sketch after this list)
  • The developing MPEG-OMAF and MPEG-I standards
  • Ongoing industry efforts, specifically towards 6DoF adaptive streaming
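As a rough illustration of viewport-dependent delivery for tiled 360-degree video, the sketch below selects the tiles that overlap the current viewing direction so they can be requested at higher quality. The 4x2 equirectangular tiling grid and the 90-degree field of view are assumptions for the example, not values mandated by OMAF.

  // Illustrative viewport-dependent tile selection for tiled 360-degree streaming.
  // The 4x2 equirectangular tiling grid and 90-degree field of view are assumptions.
  const TILE_COLS = 4; // tiles across 360 degrees of yaw
  const TILE_ROWS = 2; // tiles across 180 degrees of pitch
  const FOV_DEG = 90;  // assumed horizontal/vertical field of view

  // Return the indices of tiles that overlap the current viewport so the client
  // can fetch those at high quality and the rest at a lower-quality fallback.
  function visibleTiles(yawDeg: number, pitchDeg: number): number[] {
    const tiles: number[] = [];
    const tileWidth = 360 / TILE_COLS;
    const tileHeight = 180 / TILE_ROWS;
    for (let row = 0; row < TILE_ROWS; row++) {
      for (let col = 0; col < TILE_COLS; col++) {
        const tileYaw = col * tileWidth + tileWidth / 2 - 180;    // tile center yaw
        const tilePitch = row * tileHeight + tileHeight / 2 - 90; // tile center pitch
        const yawDelta = Math.abs(((yawDeg - tileYaw + 540) % 360) - 180); // wrap-around distance
        const pitchDelta = Math.abs(pitchDeg - tilePitch);
        if (yawDelta <= (FOV_DEG + tileWidth) / 2 && pitchDelta <= (FOV_DEG + tileHeight) / 2) {
          tiles.push(row * TILE_COLS + col);
        }
      }
    }
    return tiles;
  }

  // Example: looking straight ahead (yaw 0, pitch 0) selects the tiles around the center.
  console.log(visibleTiles(0, 0));

The same principle underlies viewport-adaptive streaming in practice: visible tiles are fetched at a high-quality representation while the remaining tiles are fetched at a low-quality fallback, so fast head motion never reveals missing content.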

Speakers
Christian Timmerer received his M.Sc. (Dipl.-Ing.) in January 2003 and his Ph.D. (Dr.techn.) in June 2006 (for research on the adaptation of scalable multimedia content in streaming and constrained environments), both from the Alpen-Adria-Universität (AAU) Klagenfurt. He joined AAU in 1999 (as a system administrator) and is currently an Associate Professor at the Institute of Information Technology (ITEC) within the Multimedia Communication Group. His research interests include immersive multimedia communication, streaming, adaptation, Quality of Experience, and Sensory Experience. He was the general chair of WIAMIS 2008, QoMEX 2013, MMSys 2016, and PV 2018 and has participated in several EC-funded projects, notably DANAE, ENTHRONE, P2P-Next, ALICANTE, SocialSensor, COST IC1003 QUALINET, and ICoSOLE. He also participated in ISO/MPEG work for several years, notably in the areas of MPEG-21, MPEG-M, MPEG-V, and MPEG-DASH, where he also served as standard editor. In 2013, he co-founded Bitmovin (http://www.bitmovin.com/) to provide professional services around MPEG-DASH, where he holds the position of Chief Innovation Officer (CIO), Head of Research and Standardization. He is a senior member of IEEE and a member of ACM, specifically the IEEE Computer Society, the IEEE Communications Society, and ACM SIGMM. Dr. Timmerer was a guest editor of three special issues of the IEEE Journal on Selected Areas in Communications (JSAC) and currently serves as an associate editor for IEEE Transactions on Multimedia. Further information is available at http://blog.timmerer.com.

Ali C. Begen is the co-founder of Networked Media, a technology company that offers consulting services to industrial, legal and academic institutions in the IP video space. He has been a research and development engineer since 2001 and has broad experience in mathematical modeling, performance analysis, optimization, standards development, intellectual property and innovation. Between 2007 and 2015, he was with the Video and Content Platforms Research and Advanced Development Group at Cisco, where he designed and developed algorithms, protocols, products and solutions in the service provider and enterprise video domains. Currently, he is also affiliated with Ozyegin University, where he teaches and advises students in the computer science department. Ali has a PhD in electrical and computer engineering from Georgia Tech. To date, he has received a number of academic and industry awards and has been granted 30+ US patents. He has held editorial positions in leading magazines and journals, and has served on the organizing committees of several international conferences and workshops in the field. He is a senior member of both the IEEE and the ACM. In 2016, he was elected as a distinguished lecturer by the IEEE Communications Society, and in 2018, he was re-elected for another two-year term. More details are at http://ali.begen.net.
