105.753 AKANA AKOR Konvexe und Tame Optimierung

2022W, VO, 3.0h, 4.5 EC

Properties

  • Semester hours: 3.0
  • ECTS: 4.5
  • Type: VO Lecture
  • Format of delivery: On-site

Learning outcomes

After successful completion of the course, students will be familiar with the two main paradigms of modern nonsmooth analysis, namely the convex and the semialgebraic paradigm. They will recognize the crucial role of these paradigms in obtaining fundamental theoretical results in optimization and operations research, such as good a priori estimates in numerical descent algorithms.


Subject of course

1. Convex and Nonsmooth Analysis 

From smooth manifolds to tangent and normal cones

Convex functions and subdifferentials 

Lipschitz functions, Clarke subdifferential
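As a quick pointer to the central notions of this part (standard definitions, added here for orientation, not taken from the course materials): for a convex function, the subdifferential at a point collects the slopes of all affine minorants that are exact there; the Clarke subdifferential extends this to locally Lipschitz functions via limits of nearby gradients.

```latex
% Convex subdifferential of f : \mathbb{R}^n \to \mathbb{R} at x:
\partial f(x) \;=\; \bigl\{\, x^* \in \mathbb{R}^n \;:\;
  f(y) \,\ge\, f(x) + \langle x^*, y - x\rangle
  \ \ \text{for all } y \in \mathbb{R}^n \,\bigr\}

% Clarke subdifferential of a locally Lipschitz f (by Rademacher's
% theorem, f is differentiable almost everywhere):
\partial^{\circ} f(x) \;=\; \operatorname{co}
  \bigl\{\, \lim_{k \to \infty} \nabla f(x_k) \;:\;
  x_k \to x,\ f \text{ differentiable at } x_k \,\bigr\}
```

For $f(x) = |x|$ both notions give the interval $[-1, 1]$ at the origin, and for convex Lipschitz functions the two sets coincide.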

 

2. Tame variational analysis

Semialgebraic functions, o-minimal structures

Stratification vs. Clarke subdifferential

Sard's theorem for nonsmooth tame functions

Łojasiewicz inequality and generalizations
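For orientation, a standard statement of the inequality named above (our paraphrase, not the course's formulation): near a critical point, the function gap is controlled by the gradient norm, and Kurdyka's generalization to o-minimal structures replaces the power function by an abstract desingularizing function.

```latex
% Lojasiewicz inequality: for f real-analytic (more generally, definable
% in an o-minimal structure) and \bar{x} a critical point, there exist
% C > 0, an exponent \theta \in [1/2, 1), and a neighbourhood U of \bar{x}
% such that
|f(x) - f(\bar{x})|^{\theta} \;\le\; C \,\|\nabla f(x)\|
  \qquad \text{for all } x \in U.

% Kurdyka-Lojasiewicz (KL) form: for some desingularizing function
% \varphi \in C^1 with \varphi(0) = 0 and \varphi' > 0,
\bigl\| \nabla \bigl( \varphi \circ (f - f(\bar{x})) \bigr)(x) \bigr\|
  \;\ge\; 1
  \qquad \text{on } U \cap \{\, f(\bar{x}) < f(x) < f(\bar{x}) + r \,\}.
```

The classical inequality is recovered with $\varphi(s) = C' s^{1-\theta}$; the KL form is the one used in the convergence analysis of descent systems in Part 3.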

 

3. Asymptotic analysis of descent systems

Proximal algorithm – steepest descent

Kurdyka’s desingularization: characterization and applications

A convex counterexample to Kurdyka’s desingularization

Asymptotic equivalence between continuous and discrete systems

Self-contracted curves, Manselli-Pucci mean width technique
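The proximal algorithm listed above can be illustrated on the simplest sharp convex function, f(x) = |x|, whose proximal map is the soft-thresholding operator (a textbook fact; the code itself is our sketch, not part of the course materials). The iterates reach the minimizer exactly after finitely many steps, the kind of strong convergence behaviour that the Kurdyka-Łojasiewicz machinery quantifies.

```python
def prox_abs(v, lam):
    """Proximal map of lam * |.|: the soft-thresholding operator.

    prox_{lam|.|}(v) = argmin_y ( |y| + (1 / (2 * lam)) * (y - v)**2 ).
    """
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0


def proximal_point(x0, lam=0.5, iters=10):
    """Proximal point iteration x_{k+1} = prox_{lam|.|}(x_k) for f(x) = |x|."""
    trajectory = [x0]
    for _ in range(iters):
        trajectory.append(prox_abs(trajectory[-1], lam))
    return trajectory


# Starting from x0 = 3.0 with step lam = 0.5, each iterate decreases by
# lam until the minimizer x* = 0 is hit exactly, after 6 iterations.
print(proximal_point(3.0))
# → [3.0, 2.5, 2.0, 1.5, 1.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0]
```

The finite termination reflects the sharpness of |x| (Łojasiewicz exponent 0 at the minimizer); for general convex functions without such KL-type behaviour the iterates can converge arbitrarily slowly, which is the theme of the convex counterexample listed above.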

Teaching methods

These lectures present the main theory of convex analysis and tame variational analysis from the optimization viewpoint. Particular emphasis will be placed on descent methods. Research topics will also be discussed. The lectures will be given in English, unless the participants decide otherwise.

The course will be supported by partial lecture notes and research papers. 


Mode of examination

Oral

Additional information

Nonsmoothness pervades optimization in most of its theoretical and practical aspects. Even if the starting point is a smooth (or even polynomial) model, natural operations such as marginal/value functions or min/max selections destroy smoothness. In addition, extrema of nonsmooth functions occur, in general, at points of nondifferentiability. This has inevitably led to the development of modern variational analysis and of nonsmooth optimization algorithms. Since Weierstrass's seminal example of 1872, exhibiting a univariate continuous real-valued function that is nowhere differentiable, it has been commonly understood that pathologies are tightly linked to almost every theory of classical analysis. Variational analysis, which handles nonsmooth objects, cannot be an exception. Notwithstanding, in most applications nonsmoothness arises together with an intrinsic structure: for instance, an initial polynomial model gives rise to a semialgebraic structure. Therefore, although a general nonsmooth theory is full of pathological situations, it is natural to consider the trace of this theory within well-behaved paradigms. This course aims at highlighting the use of these paradigms in optimization, focusing on minimization algorithms and general descent systems. After an introductory crash course in nonsmooth analysis, we shall consider

nonsmooth optimization problems enjoying a nice intrinsic structure: the tame paradigm (what is nowadays called tame optimization, encompassing the semialgebraic structures) and the classical convex paradigm. Convergence analysis of the proximal algorithm, a central tool in nonsmooth minimization, will be presented in the light of these two paradigms, emphasizing important convergence properties. So far, some of these properties remain elusive even in the convex case. Relations between continuous and discrete dynamical descent systems will be presented. A secondary aim of this course is to provide essential background and material for further research. Some open problems will be mentioned during the lectures.
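To make the tame paradigm concrete (a standard definition, added here for orientation): semialgebraic sets are the finite Boolean combinations of polynomial equalities and inequalities, and a function is semialgebraic when its graph is.

```latex
% A set S \subseteq \mathbb{R}^n is semialgebraic if
S \;=\; \bigcup_{i=1}^{p} \bigcap_{j=1}^{q}
  \bigl\{\, x \in \mathbb{R}^n : p_{ij}(x) \;\sigma_{ij}\; 0 \,\bigr\},
\qquad p_{ij} \in \mathbb{R}[x_1,\dots,x_n],\ \ \sigma_{ij} \in \{<, =\}.

% A function f : \mathbb{R}^n \to \mathbb{R} is semialgebraic if its graph
% \{(x, f(x)) : x \in \mathbb{R}^n\} is semialgebraic in \mathbb{R}^{n+1}.
```

O-minimal structures axiomatize the stability properties of this class (e.g. closure under projections, by the Tarski-Seidenberg theorem), which is what "tame" refers to in the course title.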

 

Lecturers

Institute

Course dates

Day | Time | Date | Location | Description
Wed. | 15:00 - 18:00 | 12.10.2022 - 25.01.2023 | Sem.R. DB gelb 09 | Convex and Tame Optimisation
Mon. | 10:00 - 13:00 | 24.10.2022 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Thu. | 15:00 - 18:00 | 03.11.2022 | Sem.R. DA grün 06B | Convex and Tame Optimization
Wed. | 14:00 - 15:00 | 09.11.2022 | Sem.R. DB gelb 09 | Convex and Tame Optimization
AKANA AKOR Konvexe und Tame Optimierung - individual dates

Day | Date | Time | Location | Description
Wed. | 12.10.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimisation
Wed. | 19.10.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimisation
Mon. | 24.10.2022 | 10:00 - 13:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Thu. | 03.11.2022 | 15:00 - 18:00 | Sem.R. DA grün 06B | Convex and Tame Optimization
Wed. | 09.11.2022 | 14:00 - 15:00 | Sem.R. DB gelb 09 | Convex and Tame Optimization
Wed. | 09.11.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimisation
Wed. | 16.11.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimisation
Wed. | 30.11.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimisation
Wed. | 07.12.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimisation
Wed. | 14.12.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimisation
Wed. | 21.12.2022 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimisation
Wed. | 11.01.2023 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimisation
Wed. | 18.01.2023 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimisation
Wed. | 25.01.2023 | 15:00 - 18:00 | Sem.R. DB gelb 09 | Convex and Tame Optimisation

Examination modalities

Oral presentations and oral exam

Course registration

Not required

Admission requirement

Registration requires enrolment in one of the following degree programmes:

Curricula

Study code | Semester | Reg. cond. | Info
860 GW Gebundene Wahlfächer - Technische Mathematik | | |

Literature

No lecture notes are offered for this course.

Previous knowledge

Students with a good background in Analysis/Optimization, motivated by modern trends in Mathematics, are encouraged to take this course. 

Language

English