Related projects
IT-DNN: INFORMATION THEORETIC LIMITS FOR DEEP NEURAL NETWORKS (204K€, Closed)
DEEPLEARNING: A biologically inspired algorithm for training deep neural n... (164K€, Closed)
PID2021-125711OB-I00: COMBINANDO TECNICAS DE MODELIZACION Y APRENDIZAJE AUTOMATICO... (145K€, Closed)
TIN2015-66951-C2-1-R: RECONOCIMIENTO VISUAL CON METODOLOGIAS DE APRENDIZAJE DE PRI... (91K€, Closed)
TEC2015-68172-C2-1-P: REDES PROFUNDAS Y MODELOS DE SUBESPACIOS PARA DETECCION Y SE... (100K€, Closed)
Participation deadline
No participation deadline.
Project description
One of the most significant recent developments in applied machine learning has been the resurgence of "deep learning", usually in the form of artificial neural networks. The empirical success of deep learning is stunning, and deep-learning-based systems have already led to breakthroughs in computer vision and speech recognition. In contrast, from a theoretical point of view we largely do not understand why deep learning is possible at all, since most state-of-the-art theoretical results show that deep learning is computationally hard.
Bridging this gap is a great challenge since it requires proficiency in several theoretical fields (algorithms, complexity, and statistics) together with a good understanding of real-world practical problems and the ability to conduct applied research. We believe that a good theory must lead to better practical algorithms. It should also broaden the applicability of learning in general, and of deep learning in particular, to new domains. Such a practically relevant theory may also lead to a fundamental paradigm shift in the way we currently analyze the complexity of algorithms.
Previous work by the PI and his colleagues and students has provided novel ways to analyze the computational complexity of learning algorithms and to understand the tradeoffs between data and computational time. In this proposal, in order to bridge the gap between theory and practice, I suggest a departure from worst-case analyses and the development of a more optimistic, data-dependent theory with "grey" components. Success will lead to a breakthrough in our understanding of learning at large, with significant potential for impact on the field of machine learning and its applications.
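To make the contrast between the two styles of analysis concrete, the sketch below places a generic worst-case statement next to a classical data-dependent (margin-based) guarantee for halfspaces, in standard textbook notation. It is an illustration of the two kinds of analysis, not a result taken from the proposal; constants and logarithmic factors are omitted.

% Illustrative sketch only: contrasts a worst-case hardness statement with
% a classical data-dependent (margin-based) bound; not a result of this project.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

\textbf{Worst-case view.} Under standard hardness assumptions, for some class
$\mathcal{H}$ of small neural networks, \emph{every} algorithm running in time
$\mathrm{poly}(n,1/\epsilon)$ fails to reach error $\epsilon$ on \emph{some}
distribution $\mathcal{D}$ realizable by $\mathcal{H}$.

\textbf{Data-dependent view.} If a sample $S=\{(x_i,y_i)\}_{i=1}^{m}$ with
$\|x_i\|\le R$ is separated with margin $\gamma>0$ by a halfspace $w$ with
$\|w\|\le 1$, then with probability at least $1-\delta$,
\[
  L_{\mathcal{D}}(w) \;\le\;
  O\!\left(\sqrt{\frac{R^{2}/\gamma^{2} + \log(1/\delta)}{m}}\,\right),
\]
so the guarantee is driven by the margin observed in the data rather than by
the worst case over all distributions.

\end{document}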