Intelligent Metadata-driven Processing and Distribution of Audiovisual Media
Participation deadline
No participation deadline.
Project description
iMP will create architecture, workflow and applications for intelligent metadata-driven processing and distribution of digital movies and entertainment. The goal is to enable a 'Virtual Film Factory' in which creative professionals can work together to create and customise programmes from Petabyte-scale digital repositories, using semantic technologies to organise data and drive its processing. By separating metadata from essence and controlling all image and sound processing operations from the metadata layer, the underlying data library can remain unchanged while enabling a new generation of more flexible applications. This radically reduces the amount of data created: new versions, grades, or language releases result only in additional metadata, not new data files. The system will support a more automated workflow for content distribution, from postproduction to the assembly, distribution and playout of multiple variations of programmes in different formats and locations. Outcomes will be:

- An infrastructure in which multi-Petabyte data stores are managed by persistent metadata in a distributed metadatabase.
- Real-time interaction with media sequences selected from the data store. When a sequence is changed by an application, the commands are stored as a new set of metadata; the data remains unchanged in the store, and viewing and listening sequences are rendered 'on the fly'.
- Semantic instruction sets to define associations between sequences of essence, the processes applied, and the uses to which they are put.
- Integration of currently separate processes (such as grading, CGI, audio effects and editing, version creation, previewing and viewing, customisation, mastering, and distribution) in a 'Virtual Film Factory'.
- Transfer of changes in the video space to corresponding changes in a three-dimensional audio representation.
- Automated adjustment of video and audio to the physical characteristics, acoustics or screen size of the viewing environment.
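The core idea above, separating metadata from essence so that every new version is only an ordered set of operations applied 'on the fly' while the stored data stays untouched, can be sketched as a minimal non-destructive editing model. The class and method names (`Essence`, `Version`, `trim`, `relabel`) are hypothetical illustrations, not part of the iMP system:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Essence:
    """Immutable stand-in for a media asset in the data store."""
    frames: tuple  # e.g. a tuple of frame identifiers

@dataclass
class Version:
    """A 'version' is pure metadata: an ordered list of operations
    over an essence object that is never modified."""
    essence: Essence
    ops: list = field(default_factory=list)

    def trim(self, start, end):
        # Record the edit as metadata only; no data is rewritten.
        self.ops.append(("trim", start, end))
        return self

    def relabel(self, suffix):
        # Stand-in for a grade or language release: again, metadata only.
        self.ops.append(("relabel", suffix))
        return self

    def render(self):
        """Apply the stored operations 'on the fly' to produce a view."""
        frames = list(self.essence.frames)
        for op in self.ops:
            if op[0] == "trim":
                frames = frames[op[1]:op[2]]
            elif op[0] == "relabel":
                frames = [f + op[1] for f in frames]
        return frames

master = Essence(frames=("f1", "f2", "f3", "f4"))
cut = Version(master).trim(1, 3).relabel("-graded")
print(cut.render())   # rendered view: ['f2-graded', 'f3-graded']
print(master.frames)  # essence unchanged: ('f1', 'f2', 'f3', 'f4')
```

Each additional version (another grade, another language release) is just another small `Version` object over the same essence, which is what makes the claimed reduction in created data plausible.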