Scalable Autonomic Streaming Middleware for Real time Processing of Massive Data Flows
Interesting projects
- 321526: Elastic and transparent scaling for stream processing applic... (100K€, closed)
- JUNIPER: Java platform for high performance and real time large scale... (5M€, closed)
- EXTRACT: A distributed data-mining software platform for extreme data... (5M€, closed)
- CLASS: Edge and CLoud Computation A Highly Distributed Software Ar... (4M€, closed)
- INDIGO-DataCloud: INtegrating Distributed data Infrastructures for Global Expl... (12M€, closed)
- ScaleML: Elastic Coordination for Scalable Machine Learning (1M€, closed)
Participation deadline
No participation deadline.
Project description
A growing number of applications require the ability to analyze massive amounts of streaming data in real time. Examples include market data processing, anti-spam and anti-virus filters for e-mail, network security systems for incoming IP traffic in organisation-wide networks, automatic trading, fraud detection for cellular telephony (analyzing and correlating phone calls), fraud detection for credit cards, and e-services for verifying SLAs. Such applications typically require strong analysis and processing capabilities, i.e., data mining, to discover facts of interest. Today, data analysis runs on clusters of workstations using specialized middleware and applications. Although solutions for real-time processing of information flows already exist, current platforms and infrastructures face three main limitations: (a) scalability, (b) autonomy, and (c) performance. STREAM aims to scale system size by an order of magnitude, to hundreds of nodes, while achieving real-time processing of information flows and providing unsupervised, autonomous operation. This would allow much broader deployment of such products and services to new areas that must manipulate large information flows cost-effectively, in particular the Telecom, Financial, and E-services sectors.
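To make the application class concrete, the sketch below shows the kind of per-event, real-time analysis (e.g. flagging an out-of-range transaction amount, as in credit-card fraud detection) that such streaming middleware is meant to scale out across many nodes. The class name, window size, and threshold rule are illustrative assumptions, not part of the STREAM project's actual API.

```python
from collections import deque


class SlidingWindowDetector:
    """Flag values that deviate sharply from the recent window mean.

    Toy single-node illustration of a streaming operator; a platform
    like the one described would run many such operators in parallel
    over partitioned flows. All names here are hypothetical.
    """

    def __init__(self, window_size=100, threshold=3.0):
        self.window = deque(maxlen=window_size)  # bounded history of recent values
        self.threshold = threshold               # deviations beyond threshold*std are flagged

    def process(self, value):
        """Consume one event; return True if it looks anomalous."""
        flagged = False
        if len(self.window) >= 10:  # wait for some history before judging
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = var ** 0.5
            flagged = std > 0 and abs(value - mean) > self.threshold * std
        self.window.append(value)
        return flagged


# Usage: steady traffic passes, a wildly out-of-range event is flagged.
detector = SlidingWindowDetector(window_size=50, threshold=3.0)
normal_flags = [detector.process(100 + (i % 5)) for i in range(50)]
spike_flag = detector.process(10_000)
```

Each event is processed in constant time over a bounded window, which is what makes this style of operator amenable to real-time, high-throughput deployment.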