HTML5 Media Synchronisation

This area seems to be going backwards, especially with Mozilla retiring Popcorn.js. Here are a bunch of links:

* HTML5 video - CrossoverLabs review -
* Audio Scheduling -
* Audio and video events -
* Timed audio events -
* Timed text-tracks -
* Popcorn.js - github
* Web Audio API -
* Wikimedia Commons timed text - wikipedia
* Well supported (but no events) -

# Track and cue events

The ability to use structured data in cues makes the track element extremely powerful and flexible. A web app can listen for cue events, extract the text of each cue as it fires, parse the data and then use the results to make DOM changes (or perform other JavaScript or CSS tasks) synchronised with media playback. This technique is used to synchronise video playback and map marker position in the demo at []

* Synchronised video
* HTML5-Video-with-WebVTT-Chapters -
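As a sketch of the technique described above: assume a `<track kind="metadata">` whose cues carry JSON payloads such as `{"lat": …, "lng": …}` (the payload shape and the `setMarker()` helper are hypothetical, not taken from the demo):

```javascript
// Cue text is assumed to be a JSON object, e.g. {"lat": 51.5, "lng": -0.1}.
function parseCueData(text) {
  return JSON.parse(text);
}

// Placeholder: a real app would move a map marker or update the DOM here.
function setMarker(pos) {
  console.log('marker at', pos.lat, pos.lng);
}

// Browser-only wiring, guarded so the parsing logic stays testable elsewhere.
if (typeof document !== 'undefined') {
  var video = document.querySelector('video');
  var track = video.textTracks[0]; // assumes a metadata <track> in the markup
  track.mode = 'hidden'; // load cues and fire events without rendering them
  for (var i = 0; i < track.cues.length; i++) {
    track.cues[i].onenter = function () {
      // Inside a cue event handler, 'this' is the cue itself.
      setMarker(parseCueData(this.text));
    };
  }
}
```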

The two types of cue event are:

* enter and exit events, fired for cues
* cuechange events, fired for tracks

In the previous example, cue event listeners could have been added like this:

```javascript
cue.onenter = function() {
  // do something
};

cue.onexit = function() {
  // do something else
};
```

Be aware that enter and exit events are only fired when cues are entered or exited during playback. If the user drags the timeline slider manually, a cuechange event is fired for the track at the new time, but enter and exit events are not. You can get around this by listening for the track's cuechange event and then getting the active cues. (Note that there may be more than one active cue.)

The following example gets the current cue when it changes, and attempts to create an object by parsing the cue text as JSON:

```javascript
textTrack.oncuechange = function() {
  // 'this' is a textTrack
  var cue = this.activeCues[0]; // assuming there is only one active cue
  var obj = JSON.parse(cue.text);
  // do something
};
```
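Since there may be more than one active cue, a more robust handler walks the whole activeCues list. A sketch, with the parsing pulled into a plain helper (the try/catch and the idea of skipping non-JSON cues are my additions):

```javascript
// Parse every active cue's text as JSON, skipping cues that aren't valid JSON.
function parseActiveCues(activeCues) {
  var results = [];
  for (var i = 0; i < activeCues.length; i++) {
    try {
      results.push(JSON.parse(activeCues[i].text));
    } catch (e) {
      // Ignore cues whose text isn't JSON (e.g. plain-text captions).
    }
  }
  return results;
}

// Browser-only wiring; 'textTrack' is the track from the example above.
if (typeof document !== 'undefined') {
  textTrack.oncuechange = function () {
    var objects = parseActiveCues(this.activeCues);
    // do something with each parsed object
  };
}
```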

# Not just for video

Don't forget that tracks can be used with audio as well as video, and that you don't need audio, video or track elements in HTML markup to take advantage of their APIs. The TextTrack API documentation has a nice example of this, showing a neat way to implement audio 'sprites':

```javascript
var sfx = new Audio('sfx.wav');
var track = sfx.addTextTrack('metadata'); // previously implemented as addTrack()

// Add cues for sounds we care about.
track.addCue(new TextTrackCue(12.783, 13.612, 'dog bark')); // startTime, endTime, text
track.addCue(new TextTrackCue(13.612, 15.091, 'kitten mew'));

function playSound(id) {
  sfx.currentTime = track.getCueById(id).startTime;
  sfx.play();
}

playSound('dog bark');
playSound('kitten mew');
```
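Note that the TextTrackCue constructor has since been removed from browsers in favour of VTTCue. A sketch of the same sprite idea with VTTCue, plus a hypothetical helper (my addition, not from the spec example) that pauses playback at the end of the sprite so the next cue's audio doesn't bleed in:

```javascript
// Hypothetical helper: how long a sprite cue lasts, in milliseconds.
function spriteDurationMs(cue) {
  return Math.round((cue.endTime - cue.startTime) * 1000);
}

// Browser-only wiring; VTTCue replaces the removed TextTrackCue constructor.
if (typeof Audio !== 'undefined' && typeof VTTCue !== 'undefined') {
  var sfx = new Audio('sfx.wav');
  var track = sfx.addTextTrack('metadata');
  track.addCue(new VTTCue(12.783, 13.612, 'dog bark')); // startTime, endTime, text
  track.addCue(new VTTCue(13.612, 15.091, 'kitten mew'));

  var playSound = function (id) {
    var cue = track.getCueById(id);
    sfx.currentTime = cue.startTime;
    sfx.play();
    // Pause at the sprite's end rather than letting playback run on.
    setTimeout(function () { sfx.pause(); }, spriteDurationMs(cue));
  };

  playSound('dog bark');
}
```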

You can see a more elaborate example in action at []

# Video Events

* Sam Dutton -