EXtreme scale Analytics via Multimodal Ontology Discovery & Enhancement
Participation deadline
No participation deadline.
Project description
Exascale volumes of diverse data are continuously produced by distributed sources. Healthcare data stand out in volume (more than 2,000 exabytes produced in 2020), heterogeneity (many media and acquisition methods), embedded knowledge (e.g. diagnostic reports) and commercial value. The supervised nature of deep learning models requires large labelled, annotated datasets, which prevents models from extracting knowledge and value at this scale. EXA MODE addresses this by enabling easy and fast, weakly supervised knowledge discovery from exascale heterogeneous data provided by the partners, limiting human interaction. Its objectives include the development and release of extreme-scale analytic methods and tools that are adopted in decision making by industry and hospitals.

Deep learning naturally allows building semantic representations of entities and relations in multimodal data. Knowledge discovery is performed via document-level semantic networks in text and the extraction of homogeneous features from heterogeneous images. The results are fused, aligned to medical ontologies, visualized and refined. The resulting knowledge is then applied through a semantic middleware to compress, segment and classify images, and it is exploited in decision-support and semantic knowledge management prototypes.

EXA MODE is relevant to ICT-12 in several aspects:
1) Challenge: it extracts knowledge and value from heterogeneous, quickly increasing data volumes.
2) Scope: the consortium develops and releases new methods and concepts for extreme-scale analytics that accelerate deep analysis, also via data compression, enable precise predictions, support decision making and visualize multimodal knowledge.
3) Impact: the multimodal/multimedia semantic middleware makes heterogeneous data management and analysis easier and faster, and it improves architectures for complex distributed systems with better tools that increase the speed of data throughput and access, as shown by tests in extreme analysis by industry and in hospitals.
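To make the weak-supervision idea above concrete, here is a minimal sketch in Python that is not taken from the project itself: image-level labels are derived from free-text diagnostic reports by matching surface terms against a toy concept dictionary that stands in for a medical ontology. All function names, terms and concept IDs are hypothetical and chosen only for illustration.

```python
import re
from collections import defaultdict

# Hypothetical, tiny concept dictionary standing in for a medical ontology
# (a real pipeline would align to a resource such as SNOMED CT or UMLS);
# the terms and concept IDs below are invented for this example.
ONTOLOGY_TERMS = {
    "adenocarcinoma": "CONCEPT:0001",
    "high grade dysplasia": "CONCEPT:0002",
    "hyperplastic polyp": "CONCEPT:0003",
}

def weak_labels_from_report(report_text: str) -> set[str]:
    """Return the ontology concept IDs whose surface terms appear
    in a free-text diagnostic report (case-insensitive matching)."""
    text = report_text.lower()
    return {
        concept_id
        for term, concept_id in ONTOLOGY_TERMS.items()
        if re.search(r"\b" + re.escape(term) + r"\b", text)
    }

def build_training_index(cases: list[dict]) -> dict[str, list[str]]:
    """Group image identifiers by the weak labels extracted from their
    associated reports, producing per-concept training sets without
    any manual image-level annotation."""
    index = defaultdict(list)
    for case in cases:
        for concept_id in weak_labels_from_report(case["report"]):
            index[concept_id].append(case["image_id"])
    return index

if __name__ == "__main__":
    cases = [
        {"image_id": "slide_001",
         "report": "Colon biopsy: adenocarcinoma, moderately differentiated."},
        {"image_id": "slide_002",
         "report": "Fragments consistent with hyperplastic polyp."},
    ]
    for concept, images in build_training_index(cases).items():
        print(concept, images)
```

In the project itself, concept extraction relies on document-level semantic networks and ontology alignment rather than simple keyword matching, and the grouped images would then feed weakly supervised training of image models; the sketch only illustrates the report-to-label data flow.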