Plenary Speakers

Alper Yildirim (School of Mathematics, The University of Edinburgh)

(Short Bio)

Title: Convex Relaxations of Nonconvex Quadratic Programs: A New Perspective via Convex Underestimators

Abstract: Quadratic programming is concerned with minimizing a (nonconvex) quadratic function over a polyhedron. In addition to its numerous direct applications, quadratic programs arise as subproblems in many algorithms for general nonlinear optimization. Quadratic programming is therefore a fundamental problem in optimization, and it is NP-hard in general.
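For readers less familiar with this problem class, the standard form alluded to above can be written, in generic notation (not necessarily that of the talk), as

\[
  \min_{x \in \mathbb{R}^n} \ \tfrac{1}{2}\, x^\top Q x + c^\top x
  \quad \text{s.t.} \quad A x \le b,
\]

where Q is a symmetric n×n matrix that need not be positive semidefinite; it is precisely this possible indefiniteness of Q that makes the objective nonconvex and the problem NP-hard in general.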

 


Dolores Romero Morales (Copenhagen Business School)

(Short Bio)

Title: Transparent Machine Learning Calls For More Optimization

Abstract: While the complexity of data and models is continuously increasing, the use of machine learning in high-stakes decisions calls for more transparency and accountability. In this presentation, we will navigate through some novel mathematical optimization models that embed explainability and fairness in the construction of Data Science models. This includes the ability to provide global, local and counterfactual explanations, as well as model cost-sensitivity and fairness requirements. We will show the versatility of our methodology when applied to more complex data types such as functional data.
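As background for the term "counterfactual explanation" used above, a common generic formulation (not necessarily the one adopted in this work) asks for the smallest admissible change to an instance that alters the model's prediction:

\[
  \min_{x'} \ d(x', x_0) \quad \text{s.t.} \quad f(x') = y^{\ast}, \ x' \in \mathcal{X},
\]

where x_0 is the instance to be explained, f the trained predictive model, y* the desired outcome, d a cost or distance function, and \mathcal{X} the set of plausible feature vectors.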


Marc Teboulle (School of Mathematical Sciences, Tel Aviv University)

(Short Bio)

Title: Algorithms for Structured Nonconvex Optimization

Abstract: In recent years, nonconvex optimization models have attracted revived interest among scientists working in modern applications as disparate as signal and image processing and machine learning. In such applications, the models are often genuinely nonconvex and nonsmooth, and hence very hard to solve. This talk will review some recent developments in the design and analysis of simple algorithms for some important classes of nonconvex problems, highlighting the pillars of the convergence theory and the exploitation of the problem's structure and data information.


Oliver Stein (Karlsruher Institut für Technologie)

(Short Bio)

Title: Branch-and-Bound for Continuous and Mixed-Integer Multiobjective Optimization

Abstract: This talk explains a recently developed general framework for branch-and-bound methods in multiobjective optimization. It may be applied to continuous as well as to mixed-integer convex and nonconvex multiobjective problems and actually yields the first deterministic method for the mixed-integer nonconvex case.

After summarizing some of the main ideas of branch-and-bound and of multiobjective optimization, the talk focuses on natural generalizations of notions and techniques from the single-objective to the multiobjective case, such as upper and lower bounds, discarding tests, node selection and, most importantly, a gap-based termination criterion. As a central tool, we discuss convergent enclosures for the set of nondominated points and their limiting behavior. Numerical results for two and three objective functions illustrate this approach.
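For completeness, the underlying notions can be stated in generic form (notation not necessarily that of the talk): given a feasible set X and objectives f_1, ..., f_m, the multiobjective problem is

\[
  \min_{x \in X} \ \bigl(f_1(x), \dots, f_m(x)\bigr),
\]

and a point f(x) with x in X is nondominated if no \bar{x} \in X satisfies f_i(\bar{x}) \le f_i(x) for all i with strict inequality for at least one i. An enclosure in this sense is, roughly, a set sandwiched between lower and upper bounding sets that is guaranteed to contain all nondominated points; a gap-based criterion terminates the method once the width of this enclosure drops below a prescribed tolerance.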