Storisphere

Cloud-based community video editing


Description

This is just a record of some software which isn't actually available yet. I think the intention was to make it open-source, but it just never happened. Many of the links are broken because the system is no longer up and running. Welcome to Disappointmentville!

I have a neat, unpublished document on ‘interval mathematics’ that explains some of the computations involved. I will see if I can put it out at some point.

Storisphere is an on-line video editing system to support community storytelling: users with a common interest, but not necessarily any personal connection with each other, share their raw video clips through Storisphere, which then allows them to edit pieces of those clips together to form new stories with a semi-professional touch.

Editing innovation

Storisphere maintains a repository of uploaded raw videos (rushes) and composite videos. All composites are held as time-coded references to other videos, which may themselves be either rushes or other composites. Composites, therefore, are just text files, and take up little extra space.

Ultimately, any composite video that references other composites can be resolved into an equivalent composite that only references rushes directly. When a user wishes to see any portion of a composite or a rush, the server performs this resolution so that it can build a playable video on-the-fly from the parts of the rushes that the video is ultimately composed of (the conformation process). Rather than fetching a video to the user's device, editing it locally, and retransmitting the result back to the server, the user now only needs to send editing instructions to the server. The server applies these instructions, and can immediately generate a new version of the video on demand, without having to transcode the video again. Apart from the initial upload of rushes, this orients the greater volume of communication mostly downstream, in line with the asymmetric DSL connections that most users have.
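To give a flavour of the resolution step (this is only an illustrative sketch, not the MARS code; all names here are my own invention), a composite can be viewed as an ordered list of time-coded segments, each pointing into a rush or another composite, and resolution is then a matter of interval arithmetic over those references:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;

    class Resolver {

        /** A time-coded reference into a source, which may be a rush or a composite. */
        static class Segment {
            final String sourceId;
            final long inMs, outMs;        // half-open interval [inMs, outMs)
            Segment(String sourceId, long inMs, long outMs) {
                this.sourceId = sourceId;
                this.inMs = inMs;
                this.outMs = outMs;
            }
        }

        /**
         * Flattens a timeline so that it references rushes only.
         * composites maps composite ids to their ordered segments;
         * ids absent from the map are taken to be rushes.
         */
        static List<Segment> resolve(List<Segment> timeline,
                                     Map<String, List<Segment>> composites) {
            List<Segment> flat = new ArrayList<Segment>();
            for (Segment seg : timeline) {
                List<Segment> inner = composites.get(seg.sourceId);
                if (inner == null) {       // already a direct rush reference
                    flat.add(seg);
                    continue;
                }
                // The segment selects [inMs, outMs) of the inner composite's own
                // timeline; walk that (recursively resolved) timeline and intersect.
                long cursor = 0;           // playback position on the inner timeline
                for (Segment in : resolve(inner, composites)) {
                    long len = in.outMs - in.inMs;
                    long lo = Math.max(seg.inMs, cursor);
                    long hi = Math.min(seg.outMs, cursor + len);
                    if (lo < hi) {
                        long shift = in.inMs - cursor;    // inner time -> rush time
                        flat.add(new Segment(in.sourceId, lo + shift, hi + shift));
                    }
                    cursor += len;
                }
            }
            return flat;
        }
    }

Running resolve on a composite that references other composites yields an equivalent, flat list of rush segments, which is exactly the form the conformation step needs.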

The generated video is delivered as independently decodable chunks formed from the original rushes, allowing a user who is editing to exploit his own cache of chunks for portions of rushes he has already watched. This means that small changes to a story can be viewed immediately without excessive network use, because the previous preview will typically already have fetched and cached most of the chunks needed by the new preview.

Furthermore, each new rush is encoded at several quality levels. The choice of quality used for playback need not be decided until playback is about to start, i.e., the editing is done independently of quality level. One user can be editing a video at a low quality down his 1Mbps ADSL, while another simultaneously reviews the latest changes at high quality on a 10Mbps link at work.
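The caching described in the two paragraphs above can be pictured as follows (again only a sketch with invented names, not Storisphere's code): a chunk is identified by the rush it comes from, its position within that rush, and the quality level, so a re-edited preview reuses the chunks fetched for an earlier preview, and quality is just one more part of the key, chosen when playback starts:

    import java.util.HashMap;
    import java.util.Map;

    class ChunkCache {

        /** Identifies one independently decodable chunk of one rush at one quality. */
        static class Key {
            final String rushId;
            final int chunkIndex;
            final String quality;          // e.g. "low", "high"
            Key(String rushId, int chunkIndex, String quality) {
                this.rushId = rushId;
                this.chunkIndex = chunkIndex;
                this.quality = quality;
            }
            @Override public boolean equals(Object o) {
                if (!(o instanceof Key)) return false;
                Key k = (Key) o;
                return rushId.equals(k.rushId) && chunkIndex == k.chunkIndex
                    && quality.equals(k.quality);
            }
            @Override public int hashCode() {
                return (rushId.hashCode() * 31 + chunkIndex) * 31 + quality.hashCode();
            }
        }

        private final Map<Key, byte[]> store = new HashMap<Key, byte[]>();

        /** Returns the cached chunk, or fetches and caches it on a miss. */
        byte[] get(Key key, ChunkFetcher fetcher) {
            byte[] data = store.get(key);
            if (data == null) {
                data = fetcher.fetch(key); // network fetch happens only on a miss
                store.put(key, data);
            }
            return data;
        }

        /** Stand-in for whatever transport actually retrieves a chunk. */
        interface ChunkFetcher {
            byte[] fetch(Key key);
        }
    }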

Amateur/professional bridge

Functionally, Storisphere does not distinguish between amateur and professional footage. A composite may be formed from both kinds, even if they are provided with different maximum resolutions and aspect ratios.

As well as user uploads, Storisphere can accept mechanically delivered content from (say) an IPTV service. This allows, for example, football spectators to combine their own footage taken on the terraces with professional TV broadcasts taken at superior vantage points. However, all rush clips offered to a user for editing are clearly labelled as ‘user-generated’ or ‘featured’ to keep the user aware of potential copyright issues.

Meta-data

Storisphere maintains time-coded meta-data on all content. That is, additional information can be attached to any period of time of any composite or rush, primarily to assist searching. This includes basic textual information, such as titles, descriptions, subtitles and user comments. It allows the location, altitude, attitude and real time of a recording to be retained to support time- and location-based searching. It also allows the annotation of shots and scenes, which can help to improve search results.
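A plausible shape for such time-coded meta-data (an assumed structure for illustration, not the actual Mediaplex schema) is a set of key/value annotations, each tied to a period of a particular rush or composite, with time-based search amounting to an interval-overlap query:

    import java.util.ArrayList;
    import java.util.List;

    class TimecodedMetadata {

        /** One annotation attached to a period of one piece of content. */
        static class Entry {
            final String contentId;        // the rush or composite being annotated
            final long startMs, endMs;     // period the annotation applies to
            final String key, value;       // e.g. key "subtitle", "location" or "shot"
            Entry(String contentId, long startMs, long endMs, String key, String value) {
                this.contentId = contentId;
                this.startMs = startMs;
                this.endMs = endMs;
                this.key = key;
                this.value = value;
            }
        }

        private final List<Entry> entries = new ArrayList<Entry>();

        void add(Entry e) { entries.add(e); }

        /** All annotations on the given content that overlap [fromMs, toMs). */
        List<Entry> overlapping(String contentId, long fromMs, long toMs) {
            List<Entry> hits = new ArrayList<Entry>();
            for (Entry e : entries) {
                if (e.contentId.equals(contentId) && e.startMs < toMs && e.endMs > fromMs) {
                    hits.add(e);
                }
            }
            return hits;
        }
    }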

Architecture

Storisphere has three main parts:

MARS

The Media Asset Referencing System defines an edit decision list (EDL) format based on XML. EDLs are used to describe composites.

MARS then defines two functions that can be applied to EDLs. One is resolution, in which an EDL that references other EDLs is converted to one that directly references what those other EDLs reference. The other function is conformation, in which an EDL that only references rushes directly is converted into an MPEG-4 movie.

The generated movies use features of MPEG-4 that are not often implemented. One of those is the ability to reference raw data external to the MPEG-4 file itself. Because so few players implement this feature, and those that do so do it badly, a third function—kerbside stitching—is defined to convert the MPEG-4s from MARS into integrated files which are better supported by players.
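Put together, the three functions give MARS an interface of roughly this shape (the real API is not published here, so these names and types are only a hypothetical summary of the descriptions above):

    /* Placeholder types standing in for MARS's actual EDL and movie representations. */
    class Edl { /* XML document describing a composite (details omitted) */ }
    class Mpeg4Movie { /* generated MPEG-4 container (details omitted) */ }

    interface Mars {
        /** Resolution: rewrite an EDL so that every reference points directly at a rush. */
        Edl resolve(Edl edl);

        /** Conformation: turn a rush-only EDL into a playable MPEG-4 movie,
            possibly referencing its raw data externally. */
        Mpeg4Movie conform(Edl rushOnlyEdl);

        /** Kerbside stitching: integrate externally referenced data into a single
            file for players that handle external references poorly or not at all. */
        Mpeg4Movie stitch(Mpeg4Movie movieWithExternalReferences);
    }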

MARS is written primarily in Java.

Mediaplex

Mediaplex maintains meta-data for all content that MARS serves, and also prepares (i.e., ingests) all new content for use by MARS. When content comes from professional broadcasts, Mediaplex records programme titles, synopses and subtitles among the associated meta-data. User content may include location information recorded on an Android 'phone, and Mediaplex maintains this as meta-data too. All content is analysed for shot changes, which are also recorded as meta-data.
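The shot-change analysis can be imagined along these lines (a generic technique shown for illustration, not necessarily what Mediaplex actually does): consecutive frames are compared, and a large jump in average brightness difference is recorded as a cut, which then becomes another piece of time-coded meta-data on the rush:

    import java.util.ArrayList;
    import java.util.List;

    class ShotChangeDetector {
        /**
         * @param lumaFrames one luma (brightness) array per decoded frame
         * @param fps        frame rate of the rush
         * @param threshold  mean absolute difference above which a cut is assumed
         * @return timestamps (ms) at which a new shot appears to start
         */
        static List<Long> detect(double[][] lumaFrames, double fps, double threshold) {
            List<Long> cutsMs = new ArrayList<Long>();
            for (int f = 1; f < lumaFrames.length; f++) {
                double sum = 0;
                for (int p = 0; p < lumaFrames[f].length; p++) {
                    sum += Math.abs(lumaFrames[f][p] - lumaFrames[f - 1][p]);
                }
                double meanDiff = sum / lumaFrames[f].length;
                if (meanDiff > threshold) {
                    cutsMs.add(Math.round(f * 1000.0 / fps));
                }
            }
            return cutsMs;
        }
    }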

Mediaplex components are written in a variety of languages, including Java, BASH, MATLAB, PHP…

Storiboard

This is the Web-based user interface, written in HTML5, JavaScript and PHP. It allows users to manage groups, upload rushes, edit stories and search for content.

Contributors

I mainly wrote the MARS EDL resolution and conformation system, along with the tools to prepare data for use by MARS. The following people also coded on this project:

Let's not forget the academics:

Acknowledgements

Storisphere is one of the major outputs of SCC's activity in the EPSRC FIRM project (grant number EP/H003738/1), which ended on 2013-06-30. Work on Storisphere was continued in EU FP7 STEER until 2014-11-30.

Storisphere is a development of ONE, the Open Narratives Ecology (or Environment), conceived by colleagues from BBC Research and Development at MediaCityUK (Michael Sparks and Adrian Woolard), subsequently taken forward by Dr. Adam Lindsay and others at Lancaster University, all of whom we gratefully acknowledge for their contributions.


Files

Source (SVN)
  Requirements and recommendations: GNU Make, Java 1.7, Jardeps, RJM, Linux

Source documentation