After successful completion of the course, students will be familiar with the two main paradigms of modern nonsmooth analysis, namely the convex and the semialgebraic paradigm. They will recognize the crucial role of these paradigms in obtaining fundamental theoretical results in Optimization and Operations Research, such as good a priori estimates in numerical descent algorithms.
See the English summary.
Contents.
1. Convex and Nonsmooth Analysis
From smooth manifolds to tangent and normal cones
Convex functions and subdifferentials
Lipschitz functions, Clarke subdifferential
2. Tame variational analysis
Semialgebraic functions, o-minimal structures
Stratification vs. Clarke subdifferential
Sard's theorem for nonsmooth tame functions
Łojasiewicz inequality and generalizations (recalled in standard form after this outline)
3. Asymptotic analysis of descent systems
Proximal algorithm – steepest descent
Kurdyka's desingularization: characterization and applications
A convex counterexample to Kurdyka's desingularization
Asymptotic equivalence between continuous and discrete systems
Self-contracted curves, Manselli-Pucci mean width technique
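For orientation, two of the objects listed above can be recalled in their standard form (these are textbook definitions, not excerpts from the course material). The subdifferential of a convex function f : \mathbb{R}^n \to \mathbb{R} at \bar{x} is the set

\[
\partial f(\bar{x}) = \{\, x^* \in \mathbb{R}^n : f(x) \geq f(\bar{x}) + \langle x^*, x - \bar{x} \rangle \ \text{for all } x \in \mathbb{R}^n \,\},
\]

while the Łojasiewicz inequality asserts that a real-analytic function f admits, around any critical point \bar{x}, a constant C > 0, an exponent \theta \in [1/2, 1) and a neighbourhood U of \bar{x} such that

\[
|f(x) - f(\bar{x})|^{\theta} \leq C \, \|\nabla f(x)\| \quad \text{for all } x \in U.
\]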
These lectures present the main theory of convex analysis and tame variational analysis from the optimization viewpoint. Particular emphasis will be placed on descent methods. Research topics will also be discussed. The lectures will be given in English, unless the participants decide otherwise.
The course will be supported by partial lecture notes and research papers.
Nonsmoothness pervades optimization in most of its theoretical and practical aspects. Even if the starting point is a smooth (or even a polynomial) model, natural operations such as taking marginal/value functions, min/max selections, etc., destroy smoothness. In addition, extrema of nonsmooth functions occur, in general, at points of nondifferentiability. This has inevitably led to the development of modern variational analysis and of nonsmooth optimization algorithms. Ever since Weierstrass's seminal example of 1872, exhibiting a univariate continuous real-valued function that is nowhere differentiable, it has been commonly understood that pathologies are tightly linked to almost all theories in classical analysis. Variational Analysis, which handles nonsmooth objects, is no exception. Nonetheless, in most applications nonsmoothness arises together with an intrinsic structure: for instance, an initial polynomial model will give rise to a semialgebraic structure. Therefore, although a general nonsmooth theory is full of pathological situations, it is well founded to consider the trace of this theory within well-behaved paradigms.

This course aims at highlighting the use of these paradigms in optimization, focusing on minimization algorithms or, more generally, descent systems. After an introductory crash course in Nonsmooth Analysis, we shall consider nonsmooth optimization problems that enjoy a nice intrinsic structure: the Tame paradigm (what is nowadays called Tame Optimization, encompassing in particular the semialgebraic structures) and the (classical) Convex paradigm. The convergence analysis of the proximal algorithm, a central tool in nonsmooth minimization, will be presented in the light of these two paradigms, emphasizing important convergence properties. To date, some of these properties remain elusive even in the convex case. Relations between continuous and discrete dynamical descent systems will also be presented. A secondary aim of this course is to provide essential background and material for further research. Some open problems will occasionally be mentioned during the lectures.
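To make the convergence discussion concrete, the two central objects above can be recalled in their standard form; the notation is the usual one from the literature, not taken from the course notes. Given a starting point x_0 and step sizes \lambda_k > 0, the proximal algorithm generates

\[
x_{k+1} \in \operatorname{argmin}_{x \in \mathbb{R}^n} \left\{ f(x) + \frac{1}{2\lambda_k} \|x - x_k\|^2 \right\},
\]

while Kurdyka's desingularization asserts the existence of a concave function \varphi \in C([0,\rho)) \cap C^1(0,\rho) with \varphi(0) = 0 and \varphi' > 0 such that

\[
\varphi'\big(f(x) - f(\bar{x})\big) \, \operatorname{dist}\big(0, \partial f(x)\big) \geq 1
\]

for all x near \bar{x} with f(\bar{x}) < f(x) < f(\bar{x}) + \rho. Composing \varphi with f flattens (desingularizes) the function near its critical values, which is the mechanism behind the a priori length estimates for descent trajectories mentioned in the summary.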
Registration requires continued enrolment in one of the following degree programmes:
Students with a good background in Analysis/Optimization who are motivated by modern trends in Mathematics are encouraged to take this course.