Scaling Methods for Discrete and Continuous Optimization
ScaleOpt project information
Project duration: 69 months
Start date: 2017-09-15
End date: 2023-06-30
Participation deadline: none.
Project description
One of the most important open questions in optimization is to find a strongly polynomial algorithm for linear programming. The proposed project aims to tackle this problem by combining novel techniques from two different domains: discrete optimization and continuous optimization. We expect to contribute to exciting recent developments at the interface of these two fields.
We use and develop new variants of the classical scaling technique. From the discrete optimization side, recent work of the PI on generalized flows extends classical network flow theory and opens up new domains for strongly polynomial computability beyond integer constraint matrices. We will apply this novel scaling technique to obtain strongly polynomial algorithms for broad classes of linear programs.
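For reference, the classical scaling technique that these variants build on can be illustrated on maximum flow: capacity scaling restricts augmentations to residual paths of capacity at least a parameter Delta, which is halved between phases. The sketch below is a textbook illustration only, not the PI's generalized-flow algorithm; the function name and graph representation are choices made for the example.

```python
from collections import defaultdict

def max_flow_capacity_scaling(edges, s, t):
    """Capacity-scaling max flow: augment only along residual paths of
    capacity >= delta, halving delta between phases (integer capacities)."""
    cap = defaultdict(int)      # residual capacities; reverse arcs start at 0
    adj = defaultdict(set)
    for u, v, c in edges:
        cap[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)

    def find_path(delta):
        # DFS for an s-t path using only residual arcs of capacity >= delta
        stack, parent = [s], {s: None}
        while stack:
            u = stack.pop()
            if u == t:
                path = []
                while parent[u] is not None:
                    path.append((parent[u], u))
                    u = parent[u]
                return path
            for v in adj[u]:
                if v not in parent and cap[(u, v)] >= delta:
                    parent[v] = u
                    stack.append(v)
        return None

    flow, delta = 0, 1
    max_cap = max((c for _, _, c in edges), default=0)
    while delta * 2 <= max_cap:
        delta *= 2
    while delta >= 1:           # scaling phases, coarse to fine
        path = find_path(delta)
        while path is not None:
            bottleneck = min(cap[(u, v)] for u, v in path)
            for u, v in path:
                cap[(u, v)] -= bottleneck
                cap[(v, u)] += bottleneck
            flow += bottleneck
            path = find_path(delta)
        delta //= 2
    return flow

# Tiny example: the maximum s-t flow in this network is 5.
print(max_flow_capacity_scaling([(0, 1, 3), (0, 2, 2), (1, 2, 1), (1, 3, 2), (2, 3, 3)], 0, 3))
```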
From the continuous optimization side, we aim to build the theory of geometric rescaling algorithms for linear and convex optimization. This approach combines first-order methods with geometric rescaling techniques to obtain a new family of polynomial-time algorithms. We expect to devise variants that are efficient both in theory and in practice, and to use them in a wide range of applications.
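To give a flavour of how first-order methods and geometric rescaling interact, the sketch below treats the homogeneous feasibility problem of finding x with Ax > 0: a perceptron phase takes first-order steps, and if it stalls, the constraints are rescaled by a rank-one map that widens the feasible cone, in the spirit of the Dunagan–Vempala rescaled perceptron. This is a simplified illustration, not the project's method; the phase length, the rescaling rule, and the function names are assumptions chosen for readability rather than taken from any analysis.

```python
import numpy as np

def perceptron_phase(A, max_iters):
    """First-order phase: classical perceptron steps for A x > 0."""
    x = np.zeros(A.shape[1])
    for _ in range(max_iters):
        violated = np.flatnonzero(A @ x <= 0)
        if violated.size == 0:
            return x, True
        x = x + A[violated[0]]           # step towards a violated constraint's normal
    return x, False

def rescaled_perceptron(A, max_rounds=50):
    """Alternate perceptron phases with a rank-one geometric rescaling of the
    constraints (a simplified Dunagan-Vempala-style scheme)."""
    A = A / np.linalg.norm(A, axis=1, keepdims=True)   # assumes no zero rows
    m, n = A.shape
    T = np.eye(n)                        # accumulated rescaling map
    for _ in range(max_rounds):
        x, ok = perceptron_phase(A, max_iters=m * n)
        if ok:
            return T @ x                 # map the solution back to the original space
        u = x / (np.linalg.norm(x) + 1e-12)
        M = np.eye(n) + np.outer(u, u)   # rank-one map that widens the feasible cone
        A = A @ M
        A = A / np.linalg.norm(A, axis=1, keepdims=True)
        T = T @ M
    return None

# Example: a feasible system with a reasonably wide cone around x_true.
rng = np.random.default_rng(0)
n, m = 4, 60
x_true = rng.normal(size=n)
x_true /= np.linalg.norm(x_true)
U = rng.normal(size=(m, n))
U /= np.linalg.norm(U, axis=1, keepdims=True)
A = x_true + 0.5 * U                     # every row satisfies a_i . x_true >= 0.5
x = rescaled_perceptron(A)
print(bool((A @ x > 0).all()))           # True
```

On this easy instance the first perceptron phase already succeeds; the rescaling rounds only come into play for narrower feasible cones.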
Our discrete and continuous techniques will have important applications in submodular function minimization. We will develop new, efficient algorithms for the general problem as well as for specific applications in areas such as machine learning and computer vision.
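As background, a standard continuous baseline for submodular function minimization combines Edmonds' greedy algorithm, which yields a subgradient of the Lovász extension, with projected subgradient descent over [0,1]^n and threshold rounding to recover a set. The sketch below implements this baseline on a toy cut-plus-modular function; it is illustrative only, not one of the efficient algorithms the project targets, and the step sizes, iteration count, and function names are arbitrary choices for the example.

```python
import numpy as np

def greedy_subgradient(f, w, n):
    """Edmonds' greedy algorithm: a subgradient of the Lovász extension of f
    at w (a vertex of the base polytope), assuming f(set()) == 0."""
    g = np.zeros(n)
    S, prev = set(), 0.0
    for i in np.argsort(-w):        # coordinates in decreasing order of w
        S.add(int(i))
        val = f(S)
        g[i] = val - prev
        prev = val
    return g

def minimize_submodular(f, n, iters=2000):
    """Projected subgradient descent on the Lovász extension over [0,1]^n,
    with threshold rounding to recover a set (a simple, slow baseline)."""
    w = np.full(n, 0.5)
    best_S, best_val = set(), f(set())
    for t in range(1, iters + 1):
        g = greedy_subgradient(f, w, n)
        w = np.clip(w - g / np.sqrt(t), 0.0, 1.0)   # diminishing step sizes
        for theta in np.unique(w):                  # threshold rounding
            S = {i for i in range(n) if w[i] >= theta}
            if f(S) < best_val:
                best_S, best_val = S, f(S)
    return best_S, best_val

# Toy instance: a graph cut function plus a modular term (still submodular).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
c = [1, 1, -4, 1]
f = lambda S: sum((u in S) != (v in S) for u, v in edges) + sum(c[i] for i in S)
print(minimize_submodular(f, 4))    # a minimizer with value -1
```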
In summary, the project will develop novel approaches for some of the most fundamental optimization problems. It will change the landscape of strongly polynomial computability, and make substantial progress towards finding a strongly polynomial algorithm for linear programming.