Nonsmooth optimization is devoted to the general problem of minimizing functions that are typically not differentiable at their minimizers. The classical theory of optimization cannot be applied directly to such functions, since it relies on differentiability and strong regularity assumptions. Yet, because of the complexity of the real world, the functions arising in practical applications are often nonsmooth. Nonsmooth problems appear in many fields of application, for example image denoising, optimal control, neural network training, data mining, economics, and computational chemistry and physics; it is therefore of eminent importance to be able to optimize nonsmooth functions. This course aims to provide a basic working understanding of nonsmooth analysis and its applications in optimization. Intended topics include:
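To illustrate why smoothness fails at minimizers, consider the standard textbook example f(x) = |x|: its unique minimizer is x = 0, precisely the point where the derivative does not exist, so a gradient-based method has no gradient to follow there. A minimal sketch of the classical subgradient method with diminishing step sizes (a basic scheme typically covered in such a course; the specific function and step-size rule here are illustrative choices, not taken from the course material):

```python
def subgradient(x):
    """A subgradient of f(x) = |x|: the sign of x away from 0,
    and any value in [-1, 1] at 0 (we pick 0)."""
    if x > 0:
        return 1.0
    if x < 0:
        return -1.0
    return 0.0  # 0 is a valid element of the subdifferential [-1, 1]

def subgradient_method(x0, iterations=1000):
    """Minimize f(x) = |x| via x_{k+1} = x_k - alpha_k * g_k,
    with the diminishing step sizes alpha_k = 1 / (k + 1)."""
    x = x0
    for k in range(iterations):
        x = x - subgradient(x) / (k + 1)
    return x

x = subgradient_method(3.0)
print(abs(x))  # close to the minimizer x = 0
```

Unlike gradient descent on a smooth function, the subgradient method is not a descent method: individual steps may increase f, and convergence hinges on the step sizes being diminishing but non-summable.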
Prerequisites: analysis and linear algebra (Bachelor level), smooth optimization methods (steepest descent, Newton's method, etc.).
Date & time: Tuesday and Thursday, 14:15–15:45, Wegelerstr. 6, Room 6.020