


Understanding and configuring a Dynamic MPD 

This page explains the most relevant attributes in a dynamic MPD (MPEG-DASH). Many of these attributes are unique to Live streaming scenarios and will not be present in a static MPD. None of them have to be configured (Origin takes care of this automatically), but some can be.

In general, we recommend against manually configuring any of the options below; rely on the default logic of Origin instead.

MPD@type

MPD@type is set to 'dynamic', meaning that this is an MPD for a Live stream and that new segments will become available over time.

MPD@profile

By default, the DASH 'ISO Media Live' profile is used (so @profile is urn:mpeg:dash:profile:isoff-live:2011). For certain devices that require the DVB-DASH MPD profile, you may want to use --mpd.profile to specify this profile.

MPD@availabilityStartTime

MPD@availabilityStartTime indicates the zero point on the MPD timeline of a dynamic MPD. It defaults to the Unix epoch. The benefit of this becomes evident when redundancy is required: when both encoders are set up to use UTC timing, they can start independently of each other, as the Unix epoch is always the same. Please see: Must Fix: use of UTC timestamps.

Offsetting availabilityStartTime 

Adding, for instance, 10 seconds will offset the player's Live edge by that amount.
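The effect can be sketched as follows, assuming 2-second segments and expressing availabilityStartTime as seconds past its zero point; the function name is illustrative, not part of Origin's API.

```typescript
// Sketch: how offsetting availabilityStartTime shifts the live edge.
// All values in seconds; names are illustrative, not Origin's API.

// A segment becomes available once it has been fully produced:
// availabilityStartTime + segmentStart + segmentDuration.
function segmentAvailableAt(
  availabilityStartTime: number, // offset from the zero point, e.g. Unix epoch
  segmentStart: number,          // start of the segment on the MPD timeline
  segmentDuration: number
): number {
  return availabilityStartTime + segmentStart + segmentDuration;
}

// With no offset, a 2 s segment starting at t=100 is available at t=102.
// Offsetting availabilityStartTime by +10 s delays that to t=112,
// moving the player's live edge back by the same 10 s.
const normal = segmentAvailableAt(0, 100, 2);  // 102
const offset = segmentAvailableAt(10, 100, 2); // 112
```
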

MPD@publishTime

MPD@publishTime specifies the wall-clock time at which the MPD was generated and served by Origin.

MPD@minimumUpdatePeriod

MPD@minimumUpdatePeriod defaults to two seconds (aligned with the recommended GOP/segment size sent by the encoder). It specifies the smallest period between potential changes to the MPD, so matching the segment length is the correct default. It can be changed with --mpd.minimum_update_period if needed.

Once an MPD changes from dynamic to static, MPD@minimumUpdatePeriod is removed. Absence of the MPD@minimumUpdatePeriod attribute indicates an infinite validity period (the MPD will not be updated anymore). A value of '0' indicates that the MPD has no validity after the moment it was retrieved; in that situation, the client has to acquire a new MPD whenever it wants to retrieve new media segments.
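The client-side interpretation of these three cases can be sketched as follows (the function and its shape are illustrative, not the API of any particular player):

```typescript
// Sketch of client-side MPD refresh logic based on MPD@minimumUpdatePeriod.
// `undefined` models an absent attribute (static MPD: never refresh again).

function nextMpdRefreshTime(
  fetchedAt: number,                       // wall-clock time the MPD was fetched (s)
  minimumUpdatePeriod: number | undefined  // seconds, or undefined when absent
): number | null {
  if (minimumUpdatePeriod === undefined) {
    return null; // infinite validity: the MPD will not be updated anymore
  }
  if (minimumUpdatePeriod === 0) {
    return fetchedAt; // no validity after retrieval: refetch before new segments
  }
  return fetchedAt + minimumUpdatePeriod; // earliest time the MPD may change
}
```
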

MPD@timeShiftBufferDepth

MPD@timeShiftBufferDepth signals the length of the DVR window (i.e., how far back a client can scrub to start playback at an earlier point in the stream). This length is specified with --dvr_window_length.

MPD@maxSegmentDuration

MPD@maxSegmentDuration is set to the maximum segment duration within the MPD.

MPD@minBufferTime

MPD@minBufferTime prescribes how many seconds of buffer a client should keep to avoid stalling when streaming under ideal network conditions with bandwidth matching the @bandwidth attribute. It is set to 10 seconds by default and can be changed with --mpd.min_buffer_time.

MPD@suggestedPresentationDelay

MPD@suggestedPresentationDelay is not present in the MPD by default (except when you use --mpd.profile to specify the DVB-DASH MPD profile). It is a suggested delay of the presentation relative to the Live edge. This option may be used to mitigate problems caused by a player that requests segments that do not exist yet (getting 404 errors in return). A reasonable value may be 2 to 4 times the segment duration, but not smaller than 4 seconds, so that the client can maintain some buffering. The delay can be defined with --mpd.suggested_presentation_delay.
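The rule of thumb above can be captured in a small, hypothetical helper that picks 3x the segment duration and clamps it to a 4-second floor:

```typescript
// Rule of thumb from above: a reasonable suggestedPresentationDelay is
// 2-4x the segment duration, but never below 4 seconds so the client
// keeps some buffer. This helper is hypothetical, not part of any API.
function suggestedPresentationDelay(segmentDuration: number): number {
  const multiplier = 3; // anywhere in [2, 4] is reasonable
  return Math.max(4, multiplier * segmentDuration);
}

suggestedPresentationDelay(2); // 2 s segments -> 6 s delay
suggestedPresentationDelay(1); // 1 s segments -> clamped to the 4 s floor
```
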

@presentationTimeOffset

@presentationTimeOffset is a key component in establishing the relationship between the MPD timeline and the actual media timeline, also referred to as the sample timeline. The @presentationTimeOffset is the media sample time corresponding to Period@start. Origin uses it in scenarios where a virtual subclip is requested from a Live archive and that subclip has an end time in the past.

As all of the content for the subclip will be available in the Live archive in such a case (assuming the start time is not too far in the past and still part of the archive), this will result in a VOD clip being served, whose media timeline starts at zero.

In such scenarios the presentation time offset is calculated automatically to represent the time between the Unix epoch and the start of the media, so that the media can be addressed using the original timestamps used for the livestream.
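A sketch of that calculation, under the assumption that the subclip start is known as seconds since the Unix epoch and that the offset is expressed in the track's timescale:

```typescript
// Sketch of the automatic calculation described above: for a subclip of a
// Live archive, the presentation time offset is the time between the Unix
// epoch and the start of the media, expressed in the track's timescale,
// so the original livestream timestamps keep addressing the media.

function presentationTimeOffset(
  clipStartEpochSeconds: number, // subclip start, in seconds since Unix epoch
  timescale: number              // track timescale, e.g. 90000 for video
): number {
  return clipStartEpochSeconds * timescale;
}

// A clip starting 1 700 000 000 s after the epoch, with a 90 kHz timescale:
presentationTimeOffset(1_700_000_000, 90000); // 153_000_000_000_000
```
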

Configuring presentation time offset 

Under certain circumstances, it may be useful to set this value manually. For example, an encoder may use a different clock than the DASH specified UTC clock, causing time shifts in the player. This can be mitigated with a @presentationTimeOffset in all SegmentTemplates for the livestream, which can be specified with --mpd.presentation_time_offset .

Presentation Time Offset

The presentationTimeOffset is an attribute which can be encountered in an MPD (the "manifest" of the DASH streaming technology).

Simply put, this attribute allows a client to correct an offset present in the media segments once they are decoded.

One possible use case is creating an on-demand MPD from a subsection of already-existing content, without directly modifying the segments concerned or their (possibly time-based) URLs.

Another main use case is handling multi-Period MPDs. Segments in later Periods already need to account for an offset corresponding to the start of the given Period. In those cases, the presentationTimeOffset can "cancel out" this offset, which is useful when the corresponding segments already define the right time.

  • Simple example

For example, let's imagine some on-demand content with a duration of 1 hour and 8 minutes. To keep things simple, this content begins at 00:00:00.000 and ends at 01:08:00.000.

Now let's say that we want to create a new on-demand content which is only a sub-part of this content. For example, we will take the sub-part going from 00:05:24.000 to 00:12:54.000 (for a duration of 00:07:30.000).

Because we do not want to spend money needlessly, we want to create this new content simply by writing a new MPD, without touching the already-created segments or their URLs.

Under those conditions, we still need the client to know that this content actually has an offset of 00:05:24.000. If it does not know that, it will just think that the content begins at a default 00:00:00.000 time.

Letting the client think that the content begins at the default 00:00:00.000 time could lead to several issues:

  • it might not be able to request the right first segments (as the URLs could be time-based)

  • even if it does, it might not be able to actually play the content, as we are pushing segments corresponding to 00:05:24.000 while the browser is still waiting for the 00:00:00.000 ones (in that case, we would just have an infinite buffering state)

  • even if it does, the client timeline will announce a wrong time, offset 5 minutes and 24 seconds too late

This is where the presentationTimeOffset comes into play. In our simple example, this value will just announce an offset of 00:05:24.000 (in the form of an integer with a timescale to convert it into seconds), and the client will know what to do.

What the client has to do here is:

  • begin to play at 0 seconds
  • request the right segments, by adding this offset to the one it thinks it needs
  • remove the offset from the segment before decoding it
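The three steps can be sketched as follows, assuming a timescale of 90000 (the attribute itself is not given in the example, so that value is illustrative):

```typescript
// The three steps above, sketched. The subclip starts at 00:05:24.000,
// i.e. 324 s, carried as an integer with a timescale (90000 is assumed here).
const timescale = 90000;
const presentationTimeOffset = 324 * timescale; // 29_160_000 ticks

// Step 2: to request the right segment, add the offset to the time the
// client thinks it needs (time-based URLs use media time).
function toMediaTime(presentationTime: number): number {
  return presentationTime * timescale + presentationTimeOffset;
}

// Step 3: before decoding, remove the offset so playback starts at 0.
function toPresentationTime(mediaTime: number): number {
  return (mediaTime - presentationTimeOffset) / timescale;
}

toMediaTime(0);                 // 29_160_000 -> the segment at 00:05:24 in the source
toPresentationTime(29_160_000); // 0 -> playback begins at 0 seconds
```
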

Time conversions

The presentationTimeOffset is linked to multiple other time attributes of an MPD, especially the start of the Period concerned and, of course, the time of the segments.

A simple equation puts their relationship into perspective. To understand it, we need to define some variables:

  • mediaTime: the media time, i.e. the time announced in the segments
  • TS: the timescale used by the presentationTimeOffset and the media time
  • PTO: the presentationTimeOffset, expressed in TS units
  • periodStart: the start of the concerned Period, in seconds

The presentation time (the time shown to the user) is then: presentationTime = (mediaTime - PTO) / TS + periodStart

  • Easier conversion: the timestampOffset

As seen in the previous chapter, converting the media time (the time announced in the segments) into the presentation time (the time that will be shown to the user) involves three other variables:

  • the start of the Period

  • the presentationTimeOffset

  • the timescale used by the presentationTimeOffset and the media time

As a convenient plus, those three variables rarely change for a given period.

To simplify the conversion, we can thus define a new variable using those three. This is what the timestampOffset is all about.

Let's go back to the equation in the previous chapter and isolate those three variables into a new one, timestampOffset = periodStart - PTO / TS, which gives the really simple equation: mediaTime / TS + timestampOffset = presentationTime (you can refer to the previous chapter to understand what those variables mean)

With timestampOffset defined, it becomes easy to go back and forth between the mediaTime and the presentationTime.
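As a sketch, the two conversions look like this (variable names follow the previous chapter; the concrete numbers are illustrative):

```typescript
// Back-and-forth conversion using timestampOffset, as a sketch.
// timestampOffset = periodStart - presentationTimeOffset / timescale,
// so that: presentationTime = mediaTime / timescale + timestampOffset.

function computeTimestampOffset(
  periodStart: number,            // Period start, in seconds
  presentationTimeOffset: number, // in timescale units
  timescale: number
): number {
  return periodStart - presentationTimeOffset / timescale;
}

// Period starting at 30 s, PTO of 324 s carried at a 90 kHz timescale:
const timestampOffset = computeTimestampOffset(30, 324 * 90000, 90000); // -294

const toPresentationTime = (mediaTime: number, ts: number) =>
  mediaTime / ts + timestampOffset;
const toMediaTime = (presentationTime: number, ts: number) =>
  (presentationTime - timestampOffset) * ts;
```
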

As an added bonus, SourceBuffers defined in the HTML5 Media Source Extensions also have a timestampOffset property, which means exactly the same thing as defined here!

  • In the RxPlayer

Now that we have all of those concepts out of the way, how are we going to use them in the RxPlayer?

The RxPlayer has A LOT of time-related values defined for a given segment:

  • the time defined in the segment itself (mediaTime)

  • the time displayed when playing it in the HTMLMediaElement (presentationTime)

  • the time possibly set in the request (requestSegmentTime)

  • the time as announced in the corresponding attribute of the manifest (manifestTime)

  • the time used in the corresponding Segment Object in the RxPlayer (playerTime)

  • the time used in the buffered APIs of an HTMLMediaElement or SourceBuffer (bufferedTime)

As it turns out it's a lot simpler once you make two isolated groups:

the manifest group, which uses the non-offset mediaTime.

In this group you have:

  • the mediaTime (duh)
  • the manifestTime
  • the requestSegmentTime

the real time group, which uses the offset presentationTime. In this group you have:

  • the presentationTime
  • the playerTime
  • the bufferedTime

The manifest group is then only used in the transports code of the RxPlayer. Meanwhile, the real time group is used everywhere else.

It's actually the transports code that does most of the conversion for the rest of the code (removing the offset when requesting new segments, and re-adding it once the segment is downloaded).

To be able to offset those segments in the SourceBuffer, the transports code of course still informs it of the timestampOffset. This timestampOffset is then exploited only by the final decoding code.



Live stream with availabilityTimeOffset

Example showing how dash.js handles live streams with an availabilityTimeOffset (ATO). In this case the ATO is set to 10 seconds; consequently, media segments are available 10 seconds earlier than their usual availability start time. As a result, the buffer level will be up to 10 seconds higher than the live latency.
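A sketch of the availability calculation, under the usual assumption that a segment becomes available at availabilityStartTime plus its end time on the MPD timeline, minus the ATO (names are illustrative):

```typescript
// Sketch: with availabilityTimeOffset (ATO), a segment becomes available
// ATO seconds earlier than its usual availability start time.
// All values in seconds; names are illustrative.

function segmentAvailabilityStart(
  availabilityStartTime: number, // MPD@availabilityStartTime, as seconds
  segmentEnd: number,            // end of the segment on the MPD timeline
  availabilityTimeOffset: number // ATO, e.g. 10 in the example above
): number {
  return availabilityStartTime + segmentEnd - availabilityTimeOffset;
}

// With ATO = 10, a segment ending at t=100 is available at t=90 instead of
// t=100, letting the client buffer up to 10 s more than the live latency.
segmentAvailabilityStart(0, 100, 0);  // 100
segmentAvailabilityStart(0, 100, 10); // 90
```
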

