Video Metrics

Related HTML WG bug: http://www.w3.org/Bugs/Public/show_bug.cgi?id=12399


Requirements

For several reasons, we need to expose the performance of media elements to JavaScript.

One concrete use case is that content publishers want to understand the quality of their content as their users experience it, and how much of it users actually play back. For example, if a video always goes into buffering mode after one minute for all users, there may be a problem with the encoding, or the video may be too big for the typical bandwidth/CPU combination. Publishers also want to track how much of their video and audio files is actually being watched.
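
A rough version of these measurements is already possible with existing HTMLMediaElement events. The sketch below counts rebuffering stalls and accumulates watch time; report() is a placeholder for whatever analytics call a publisher uses.

  // Count rebuffering stalls and accumulate watch time with existing events.
  var video = document.querySelector('video');
  var stalls = 0;
  var secondsWatched = 0;
  var lastTime = null;

  function report(data) {
    // Placeholder: send the data to the publisher's analytics endpoint.
    console.log(data);
  }

  video.addEventListener('waiting', function () {
    stalls++; // playback halted to rebuffer (also fires on seeks)
  });

  video.addEventListener('timeupdate', function () {
    var t = video.currentTime;
    // Only count small forward steps so seeks are not mistaken for watching.
    if (lastTime !== null && t > lastTime && t - lastTime < 1) {
      secondsWatched += t - lastTime;
    }
    lastTime = t;
  });

  video.addEventListener('ended', function () {
    report({ stalls: stalls, secondsWatched: secondsWatched });
  });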

A further use case is HTTP adaptive streaming, where an author wants to manually implement an algorithm for switching between different resources of different bandwidths or screen sizes. For example, if the user goes full screen and the user's machine and bandwidth allow for it, the author might want to switch to a higher-resolution video.


Collection of Proposals/Implementations

Mozilla has implemented the following statistics in Firefox (http://blog.pearce.org.nz/2011/03/html5-video-painting-performance.html):

  • mozParsedFrames - number of frames that have been demuxed and extracted out of the media.
  • mozDecodedFrames - number of frames that have been decoded - converted into YCbCr.
  • mozPresentedFrames - number of frames that have been presented to the rendering pipeline for rendering - were "set as the current image".
  • mozPaintedFrames - number of frames which were presented to the rendering pipeline and ended up being painted on the screen. Note that if the video is not on screen (e.g. in another tab or scrolled off screen), this counter will not increase.
  • mozPaintDelay - the time delay between presenting the last frame and it being painted on screen (approximately).
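
As a rough illustration of the attributes just listed, a page running in Firefox could poll them to see how many presented frames never reached the screen (a sketch only; the attributes are Firefox-specific and prefixed):

  // Poll Firefox's moz* counters once a second and log painting efficiency.
  var video = document.querySelector('video');

  setInterval(function () {
    if (video.mozPresentedFrames === undefined) return; // attributes not supported
    var unpainted = video.mozPresentedFrames - video.mozPaintedFrames;
    console.log('parsed:', video.mozParsedFrames,
                'decoded:', video.mozDecodedFrames,
                'presented but not painted:', unpainted,
                'paint delay:', video.mozPaintDelay);
  }, 1000);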


WebKit has implemented these (https://bugs.webkit.org/show_bug.cgi?id=53322):

  • webkitAudioBytesDecoded - number of audio bytes that have been decoded.
  • webkitVideoBytesDecoded - number of video bytes that have been decoded.
  • webkitDecodedFrames - number of frames that have been demuxed and extracted out of the media.
  • webkitDroppedFrames - number of frames that were decoded but not displayed due to performance issues.
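
A comparable sketch against the WebKit attributes, using the names as listed above (the names that eventually ship may differ):

  // Log decoded byte counts and a dropped-frame ratio from WebKit's counters.
  var video = document.querySelector('video');

  setInterval(function () {
    if (video.webkitDecodedFrames === undefined) return; // attributes not supported
    var dropRatio = video.webkitDroppedFrames / Math.max(1, video.webkitDecodedFrames);
    console.log('audio bytes decoded:', video.webkitAudioBytesDecoded,
                'video bytes decoded:', video.webkitVideoBytesDecoded,
                'dropped frame ratio:', dropRatio.toFixed(3));
  }, 1000);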


JW Player (using ActionScript; see http://help.adobe.com/nl_NL/AS3LCR/Flash_10.0/flash/net/NetStreamInfo.html) broadcasts the following QOS metrics for both RTMP dynamic and HTTP adaptive streaming:

  • bandwidth: server-client data rate, in kilobytes per second.
  • latency: client-server-client roundtrip time, in milliseconds.
  • frameDropRate: number of frames not presented to the viewer, in frames per second.
  • screenWidth / screenHeight: dimensions of the video viewport, in pixels. These change e.g. when the viewer goes fullscreen.
  • qualityLevel: index of the currently playing quality level (see below).

Bandwidth and frameDropRate are running metrics (averaged over time). Latency and dimensions are sampled (taken once). For RTMP dynamic, the metrics are broadcast at a settable interval (default 2 s). For HTTP adaptive, metrics are calculated and broadcast upon completion of a fragment load.

Separately, JW Player broadcasts a SWITCH event at the painting of a frame that has a different qualityLevel than the preceding frame(s). While the metrics.qualityLevel tells developers the qualityLevel of the currently downloading buffer/fragment, the SWITCH event tells developers the exact point in time where the viewer experiences a jump in video quality. This event also helps developers correlate the value of frameDropRate to the currently playing qualityLevel (as opposed to the currently loading one). Depending upon buffer, fragment and GOP size, the time delta between a change in metrics.qualityLevel and SWITCH.qualityLevel may vary from a few seconds to a few minutes.

Finally, JW Player accepts and exposes, per video, an array of quality levels (the distinct streams of a video between which the player can switch). For each quality level, properties like bitrate, framerate, height and width are available. The plain mapping using qualityLevel works because JW Player to date solely supports single A/V-muxed dynamic/adaptive videos, with no multi-track support.
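
The same kind of metrics can drive a switching heuristic in script. The sketch below is not JW Player's API; it only assumes a metrics object carrying the fields listed above and a quality-level array sorted by ascending bitrate:

  // Pick the highest quality level the measured conditions can sustain.
  // "metrics" carries the broadcast fields above (bandwidth in kB/s,
  // frameDropRate in frames/s); "levels" is assumed sorted by bitrate (kbit/s).
  function pickQualityLevel(metrics, levels) {
    var availableKbps = metrics.bandwidth * 8; // kB/s -> kbit/s
    var best = 0;
    for (var i = 0; i < levels.length; i++) {
      // Leave ~20% headroom, and stay low while frames are being dropped.
      if (levels[i].bitrate < availableKbps * 0.8 && metrics.frameDropRate < 1) {
        best = i;
      }
    }
    return best;
  }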


For HTTP adaptive streaming, the following statistics have been proposed (http://wiki.whatwg.org/wiki/Adaptive_Streaming#QOS_Metrics):

  • downloadRate: The current server-client bandwidth (read-only).
  • videoBitrate: The current video bitrate (read-only).
  • droppedFrames: The total number of frames dropped for this playback session (read-only).
  • decodedFrames: The total number of frames decoded for this playback session (read-only).
  • height: The current height of the video element (already exists).
  • videoHeight: The current height of the video file (already exists).
  • width: The current width of the video element (already exists).
  • videoWidth: The current width of the video file (already exists).
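
These attributes are proposed rather than implemented, but a manual switching loop built on them might look like the sketch below. The switchTo* functions are placeholders for the author's own resource selection, and downloadRate and videoBitrate are assumed to be reported in the same units:

  // Manual adaptive-streaming loop over the proposed (not yet implemented)
  // attributes; the switchTo* functions stand in for the author's own logic.
  function switchToLowerBitrate(video) { /* swap video.src to a lower-bitrate file */ }
  function switchToHigherBitrate(video) { /* swap video.src to a higher-bitrate file */ }

  var video = document.querySelector('video');
  var lastDropped = 0;

  setInterval(function () {
    var newlyDropped = video.droppedFrames - lastDropped;
    lastDropped = video.droppedFrames;

    if (newlyDropped > 10 || video.downloadRate < video.videoBitrate) {
      switchToLowerBitrate(video);   // can't keep up: step down
    } else if (video.downloadRate > video.videoBitrate * 2) {
      switchToHigherBitrate(video);  // plenty of headroom: step up
    }
  }, 2000);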


Further, a requirement to expose playback rate statistics has come out of issue-147 (http://lists.w3.org/Archives/Public/public-html/2011Feb/0113.html):

  • currentPlaybackRate: the rate at which the video/audio is currently playing back.
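
Until such an attribute exists, the achieved rate can be approximated in script by comparing media-time progress against wall-clock time (a sketch that ignores seeks and pauses for brevity):

  // Approximate the achieved playback rate: media seconds advanced per
  // wall-clock second. Diverges from video.playbackRate when playback
  // cannot keep up (slow decoding, rebuffering, ...).
  var video = document.querySelector('video');
  var lastMediaTime = 0;
  var lastWallTime = Date.now();

  setInterval(function () {
    var now = Date.now();
    var achieved = (video.currentTime - lastMediaTime) / ((now - lastWallTime) / 1000);
    lastMediaTime = video.currentTime;
    lastWallTime = now;
    console.log('requested rate:', video.playbackRate,
                'achieved rate:', achieved.toFixed(2));
  }, 1000);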


Here are a few metrics that measure the QoS that a user receives (http://www.quora.com/Video-Analytics/What-are-the-most-valuable-metrics-in-online-video-analytics):

  • playerLoadTime
  • streamBitrate

(user interaction and playthrough can be measured using existing events)
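
For example, a playerLoadTime-style figure can already be derived from existing events, as in this sketch measuring the time from the start of loading until the element reports it can play:

  // Derive a playerLoadTime-style metric: time from starting to load the
  // resource until the element first reports it can begin playback.
  var video = document.querySelector('video');
  var loadStarted;

  video.addEventListener('loadstart', function () {
    loadStarted = Date.now();
  });

  video.addEventListener('canplay', function () {
    if (loadStarted !== undefined) {
      console.log('load-to-canplay (ms):', Date.now() - loadStarted);
      loadStarted = undefined; // only report the first time
    }
  });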