Streaming Video without Plugins: The future of online media
Jeff Tapper, Digital Primates

Who am I?
– Senior Consultant at Digital Primates, building next generation client applications
– Built video applications for many of the most watched live broadcasts
– Developing Internet applications for 19 years
– Author of 12 books on Internet technologies

Who are you?

Agenda
– Video and the Internet today
– Understanding HTTP streaming
– What are the streaming options without a plugin?
– What is DASH?
– What is DASH-264?
– Making it work in a browser
– Questions

Online video options
– Progressive download
– Real-time protocols (RTP, RTMP, RTSP, etc.)
– HTTP streaming (HDS, HLS, Smooth Streaming, etc.)

The challenge
– Most agree that HTTP streaming is the most efficient choice
– Different devices support different streaming protocols
– No single standard is currently supported ubiquitously
– Results in media being served in several different formats to support the broadest range of devices

What do browsers support?
– Unfortunately, progressive download is the only ubiquitously supported option
– Different browsers support different video codecs: H.264, WebM/VP8, etc.
– Safari (iOS and Mac OS only) natively supports HLS
– MediaSource Extensions are available in Chrome and IE11

MediaSource Extensions (MSE)
– MSE allows pieces (segments) of media to be handed directly to the HTML5 video tag's buffer
– This enables HTTP streaming in HTML5
– Not universally supported yet
– Currently (as of January 2014) a Candidate Recommendation of the W3C HTML Working Group
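
As a minimal sketch (not from the talk), a segment can be handed to the video element roughly like this; the codec string and segment URL are illustrative placeholders:

// Minimal MSE sketch: hand a downloaded segment to the video tag's buffer.
// The codec string and segment URL are illustrative placeholders.
var video = document.querySelector("video");
var mimeCodec = 'video/mp4; codecs="avc1.42E01E"';

if (window.MediaSource && MediaSource.isTypeSupported(mimeCodec)) {
    var mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);
    mediaSource.addEventListener("sourceopen", function () {
        var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "segments/init-and-first-segment.m4s"); // hypothetical URL
        xhr.responseType = "arraybuffer";
        xhr.onload = function () {
            // Append the raw bytes directly to the buffer feeding the video element.
            sourceBuffer.appendBuffer(new Uint8Array(xhr.response));
        };
        xhr.send();
    });
}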

What is MPEG-DASH?
– DASH: Dynamic Adaptive Streaming over HTTP
– International open standard, developed and published by ISO
– Addresses both simple and advanced use cases
– Enables highest-quality multiscreen distribution and efficient dynamic adaptive switching
– Enables reuse of existing content, devices and infrastructure
– Attempts to unify HTTP streaming around a single standard

DASH and codecs
– The DASH specification is codec agnostic
– Any existing or future codec can work with DASH
– The DASH manifest describes which codec is used
– A single manifest can describe several different versions of the content in different codecs

DASH-264
– H.264 is the dominant format today
– Many vendors and service providers are committed to supporting/enabling DASH-264
– Provides support for today's requirements, such as DRM
– H.264 is backed by rigorous testing and conformance

DASH Industry Forum Addressing the dramatic growth of broadband video by recommending a universal delivery format that provides end users with the best possible media experience by dynamically adapting to changing network conditions.

DASH Industry Forum
Objectives:
– promote and catalyze market adoption of MPEG-DASH
– publish interoperability and deployment guidelines
– facilitate interoperability tests
– collaborate with standards bodies and industry consortia in aligning ongoing DASH standards development and the use of common profiles across industry organizations
Over 65 members
Visit http://dashif.org for more information
Released the DASH/264 standard

Building a DASH player
We have built DASH players for several different platforms:
– Flash
– Android
– HTML5/JavaScript (dash.js)
dash.js is available as an open source project (BSD-3 license) on GitHub
dash.js is the reference player for the DASH Industry Forum (dashif.org)

How to play a DASH stream
1. Download the manifest
2. Parse the manifest
3. Determine the optimal bandwidth for the client
4. Initialize for that bandwidth
5. Download a segment
6. Hand the segment to MSE
7. Check bandwidth to determine whether a change is necessary
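
As a rough illustration only, that loop might look like the following in JavaScript. Every helper here (downloadManifest, parseManifest, pickBitrate, downloadSegment, appendToSourceBuffer, measureBandwidth) is a hypothetical placeholder, not dash.js API:

// Highly simplified sketch of the playback loop described above.
// All helpers are hypothetical placeholders, not real dash.js functions.
async function playDashStream(manifestUrl, sourceBuffer) {
    var manifestXml = await downloadManifest(manifestUrl);             // 1. download manifest
    var manifest = parseManifest(manifestXml);                         // 2. parse manifest
    var quality = pickBitrate(manifest, measureBandwidth());           // 3./4. choose bandwidth and initialize

    for (var index = 0; index < manifest.segmentCount; index++) {
        var segment = await downloadSegment(manifest, quality, index); // 5. download segment
        appendToSourceBuffer(sourceBuffer, segment);                   // 6. hand segment to MSE
        quality = pickBitrate(manifest, measureBandwidth());           // 7. re-check bandwidth
    }
}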

Understanding DASH structure
Three types of files:
– Manifest (.mpd) – an XML file describing the segments
– Initialization file – contains the headers needed to decode the bytes in the segments
– Segment files – contain the playable media, including:
  – 0…many video tracks
  – 0…many audio tracks

DASH Manifest
A manifest contains:
– A root node
– 1 or more Periods
  – Periods contain 1 AdaptationSet per video stream and 1 AdaptationSet per audio stream
– AdaptationSets contain:
  – ContentComponent nodes (one for each video or audio track)
  – 1 or more Representation nodes
    » Each Representation describes a single bitrate
    » Representations contain the data needed to find the actual segments
    » There are different ways a Representation can describe its segments
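
That hierarchy can be pictured as a skeleton MPD; the element names follow the DASH specification, and all attributes are omitted here for brevity:

<MPD>
  <Period>
    <AdaptationSet>               <!-- one per video or audio stream -->
      <ContentComponent/>         <!-- one per track -->
      <Representation>            <!-- one per bitrate -->
        <!-- segment information: SegmentBase, SegmentList or SegmentTemplate -->
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>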

Describing Representations
SegmentBase
– Describes a stream with only a single segment per bitrate
– Can be used for byte range requests
SegmentList
– A SegmentList contains an explicit list of SegmentURL entries (one HTTP request per segment of media data)
– Can be used for byte range requests
SegmentTemplate
– Defines a known URL pattern for the fragments, with wildcards resolved at runtime to request segments (see bbb.mpd)
– Alternatively, can specify a list of segments based on duration

SegmentList
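
As an illustration, a minimal SegmentList might look like this; the URLs, durations and codec values are placeholders, not taken from the talk:

<Representation id="1" mimeType="video/mp4" codecs="avc1.42E01E" width="1280" height="720" bandwidth="1800000">
  <SegmentList timescale="1000" duration="10000">
    <Initialization sourceURL="video/init.mp4"/>
    <SegmentURL media="video/segment-1.m4s"/>
    <SegmentURL media="video/segment-2.m4s"/>
    <SegmentURL media="video/segment-3.m4s"/>
  </SegmentList>
</Representation>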

SegmentTemplate with a fixed segment duration:

<Representation id="1" mimeType="video/mp4" codecs="avc f" width="1280" height="720" startWithSAP="1" bandwidth=" ">
  <SegmentTemplate timescale="1000" duration="13809" media="bbb_seg_BigBuckBunny_720p_1800kbps_44khz_track1$Number$.m4s" startNumber="1"/>
</Representation>

SegmentTemplate with a variable segment duration:

<AdaptationSet group="2" mimeType="video/mp4" par="16:9" minBandwidth="475000" maxBandwidth=" " minWidth="176" maxWidth="1680" minHeight="99" maxHeight="944" segmentAlignment="true" startWithSAP="1">
  <SegmentTemplate timescale="1000" initialization="dash/ateam-video=$Bandwidth$.dash" media="dash/ateam-video=$Bandwidth$-$Time$.dash">
  …

dash.js player

dash.js is a free, open source player
– Code available on GitHub
– Currently the base of several different production players
– Recent uses include:
  – Recent BBC live broadcasts (3/14-3/28)
  – Wowza
  – EZDRM
  – And more!

How to play a DASH stream
1. Download the manifest
2. Parse the manifest
3. Determine the optimal bandwidth for the client
4. Initialize for that bandwidth
5. Download a segment
6. Hand the segment to MSE
7. Check bandwidth to determine whether a change is necessary

Tools used by dash.js
Core player:
– Q – asynchronous handling with promises
– Dijon – dependency injection / inversion of control (DI/IoC)
– Jasmine – unit tests
Web site:
– jQuery – DOM manipulation
– Flat UI – UI elements
– Flot – charting
– Kendo UI – components

Class structure
The player is divided into two main packages:
– streaming – contains the classes responsible for creating and populating the MediaSource buffers. These classes are intended to be abstract enough for use with any segmented stream (such as DASH, HLS, HDS and MSS).
– dash – contains the classes responsible for making decisions specifically related to DASH.

streaming package

MediaPlayer.js
Exposes the top-level functions and properties to the developer (play, autoPlay, isLive, ABR quality, and metrics). The manifest URL and the HTML video object are passed to the MediaPlayer.

Context.js
The dependency mapping for the streaming package. The context is passed into the MediaPlayer object, allowing different MediaPlayer instances to use different mappings.

Stream.js
Loads/refreshes the manifest. Creates SourceBuffers from the MediaSource. Creates BufferController instances to manage the SourceBuffers. Responds to events from the HTML video object. For a live stream, the live edge is calculated and passed to the BufferController instances.

Debug.js
Convenience class for logging. The default implementation simply uses console.log(). Acts as an extension point for tapping into logging messages.

BufferController.js
Responsible for loading fragments and pushing the bytes into the SourceBuffer. Once play() has been called, a timer is started to check the status of the bytes in the buffer. If the amount of time left to play is less than Manifest.minBufferTime, the next fragment is loaded. Records metrics related to playback.
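
A simplified sketch of that buffer check follows; the function and parameter names are illustrative rather than the actual dash.js internals, and the fragment loader is passed in as a hypothetical callback:

// Simplified sketch of the buffer check described above (not actual dash.js code).
function checkBuffer(video, sourceBuffer, minBufferTime, loadNextFragment) {
    var buffered = sourceBuffer.buffered;
    if (buffered.length === 0) {
        loadNextFragment();
        return;
    }
    // Time remaining between the playhead and the end of the buffered range.
    var bufferLevel = buffered.end(buffered.length - 1) - video.currentTime;
    if (bufferLevel < minBufferTime) {
        loadNextFragment(); // download and append the next fragment
    }
}
// Called on a timer once play() has been invoked, for example:
// setInterval(function () { checkBuffer(video, sourceBuffer, manifest.minBufferTime, loadNextFragment); }, 500);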

FragmentLoader.js
Responsible for loading fragments. Loads requests sequentially.

ManifestLoader.js
Responsible for loading manifest files. Returns the parsed manifest object.

AbrController.js
Responsible for deciding if the current quality should be changed. The stream metrics are passed to a set of 'rules'.
Methods:
– getPlaybackQuality(type, data)
  – type – the type of the data (audio/video)
  – data – the stream data
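
As an illustration of how such a controller could combine its rules, the sketch below asks each rule for a quality index and keeps the most conservative answer; the rule method name (checkIndex) is a placeholder, not the real dash.js rule API:

// Illustrative only: combine rule suggestions by taking the lowest (safest) quality.
// The rule interface (checkIndex) is a hypothetical placeholder.
function getPlaybackQuality(type, data, rules, currentQuality) {
    if (rules.length === 0) {
        return currentQuality;
    }
    var suggestions = rules.map(function (rule) {
        return rule.checkIndex(currentQuality, data, type);
    });
    // The most conservative (lowest) suggestion wins.
    return Math.min.apply(null, suggestions);
}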

DownloadRatioRule.js
Validates that fragments are being downloaded in a timely manner. Compares the time it takes to download a fragment to how long it takes to play out a fragment. If the download time is considered a bottleneck, the quality will be lowered.
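
A stripped-down version of that comparison might look like this; the names are illustrative:

// Illustrative download-ratio check: if a fragment takes longer to download
// than it takes to play out, treat the network as the bottleneck and step down.
function downloadRatioRule(downloadSeconds, fragmentDurationSeconds, currentQuality) {
    var ratio = fragmentDurationSeconds / downloadSeconds; // > 1 means downloads are faster than real time
    if (ratio < 1 && currentQuality > 0) {
        return currentQuality - 1; // lower the quality
    }
    return currentQuality;         // otherwise keep the current quality
}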

InsufficientBufferRule.js
Validates that the buffer doesn't run dry during playback. If the buffer runs dry repeatedly, it likely means that the player has a processing bottleneck (video decode time is longer than playback time).
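
A comparably simplified sketch of this rule, with an arbitrary threshold, could be:

// Illustrative only: if the buffer has emptied repeatedly, assume a processing
// bottleneck and suggest a lower quality. The threshold of 3 is arbitrary.
function insufficientBufferRule(recentUnderrunCount, currentQuality) {
    if (recentUnderrunCount >= 3 && currentQuality > 0) {
        return currentQuality - 1;
    }
    return currentQuality;
}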

LimitSwitchesRule.js
Watches for competing rules to avoid constant bitrate switches. If two or more rules are causing switches too often, this rule limits the switches to give a better overall playback experience.

dash package

DashContext.js
Defines the dependency mapping specific to the dash package:
– Parser
– Index handler
– Manifest extensions

DashParser.js
Converts the manifest to a JSON object. Converts duration and datetime strings into number/date objects. Manages inheritance fields:
– Many fields are inherited from parent to child nodes in DASH.
– For example, a BaseURL defined on a parent node is inherited by all of its child nodes.
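
The inheritance handling can be sketched as a simple recursive pass over the parsed tree; the node shape below (BaseURL, children, resolvedBaseUrl) is a placeholder, not the real parser output:

// Illustrative sketch of BaseURL inheritance while flattening the manifest.
function inheritBaseUrl(node, parentBaseUrl) {
    var baseUrl = node.BaseURL || parentBaseUrl;  // a node's own value wins, otherwise inherit
    node.resolvedBaseUrl = baseUrl;
    (node.children || []).forEach(function (child) {
        inheritBaseUrl(child, baseUrl);
    });
}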

DashHandler.js
Responsible for deciding which fragment URL should be loaded.
Methods:
– getInitRequest(quality) – returns an initialization request for a given quality, if available.
– getSegmentRequestForTime(time, quality) – returns a fragment URL to load for a given quality and a given time. Returns a Stream.vo.SegmentRequest object.
– getNextSegmentRequest(quality) – returns the next fragment URL to load. Assumes that getSegmentRequestForTime() has already been called.
– getCurrentTime(quality) – returns the time for the last loaded fragment index.

DashHandler.js (cont'd)
Uses the available information in the manifest (SegmentList, SegmentTemplate, SegmentBase). When using a single, non-fragmented MP4 file, the SIDX box is loaded to determine the byte ranges for segments.
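
A hypothetical caller could drive the DashHandler interface described above roughly as follows; indexHandler, loadAndAppend and the action property on the returned request are assumptions for illustration:

// Hypothetical usage of the index handler interface (not actual dash.js code).
var initRequest = indexHandler.getInitRequest(quality);
loadAndAppend(initRequest);

var request = indexHandler.getSegmentRequestForTime(video.currentTime, quality);
while (request && request.action === "download") {
    loadAndAppend(request);
    request = indexHandler.getNextSegmentRequest(quality);
}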

Flow
1. Create the Context and MediaPlayer instances:
   var context = new Dash.di.DashContext(),
       player = new MediaPlayer(context);
2. Initialize the MediaPlayer and set the manifest URL:
   player.startup();
   player.setIsLive(false);
   player.attachSource(manifest_url);
3. Attach the HTML video element:
   var video = document.querySelector(".dash-video-player video");
   player.autoPlay = true;
   player.attachView(video);

4. Call play() on the MediaPlayer (if autoPlay = false).
5. The Stream object is created and initialized with the manifest URL.
6. The manifest is loaded and then parsed.
7. MediaSource, SourceBuffers, and BufferControllers are created.
   – One BufferController is created per stream type (usually video and audio).
8. The duration of the MediaSource is set to the duration of the manifest (or infinity for a live stream).
9. If the stream is live, the live edge is calculated.
10. play() is called on the HTML video element.
11. The BufferController instances create a timer. When the timer ticks, the state of the buffers is checked.

BufferController.validate()
1. Check whether the buffers need more data:
   – Must be in a playing state.
   – Must not already be loading data.
   – Must require more data to be buffered (amountBuffered < manifest.minBufferTime).
2. If automatic ABR is enabled, check whether the bitrate should be changed:
   – Ask the AbrController for the new quality.
   – Rules determine which bitrate to change to.
3. If this is initial playback, a seek, or the bitrate has changed, load the initialization fragment (if available).

4. Ask the IndexHandler for the next fragment request:
   – If seeking, pass the seek time to the IndexHandler.
   – Otherwise, ask for the 'next' fragment.
   – Pass the bitrate to the IndexHandler.
5. The IndexHandler returns a SegmentRequest indicating what action the BufferController should take next:
   – "download" – download and append the fragment to the buffer.
   – "stall" – wait, because the IndexHandler is not ready.
   – "complete" – signal that the stream has completed playback.
6. Repeat.

Resources
– DASH Industry Forum: http://dashif.org
– Reference player and reference player source code
– HTML extensions: the MSE and EME specifications
– Twitter

Questions?