Chaos, artificial intelligence, order - all faces of applied mathematics

On Thursday, October 6, 2022, from 9:30 a.m. to 4:15 p.m., a teacher training course on the above topic will take place at the Institute of Mathematics at Clausthal University of Technology, Erzstraße 1, in cooperation with the Competence Center for Teacher Training Braunschweig (KLBS). We cordially invite you to attend. Participation costs 25 euros per person and is charged via the KLBS.

Registration is possible until September 6, 2022 at http://vedab.nibis.de. Direct link:

https://vedab.de/veranstaltungsdetails.php?vid=132827

Contents

In the training course, three speakers will give talks on current topics in mathematics and put them up for discussion:

Prof. Dr. W. Herget, University of Halle-Wittenberg
Mathematics has many faces ... applied, turned away and turned toward

... applied: Learning mathematics, what is it good for? One answer is application- and reality-oriented mathematics teaching. It shows: mathematics is useful.
... turned away: But mathematics can also simply be "nice". Good for nothing. Simply beautiful. This side, too, belongs in general-education mathematics lessons. I present a series of surprisingly simple, vivid and tangible examples. And alongside applied and turned away, a third face becomes clear, namely turned toward: in order to bring "my" mathematics closer to the students, I have to turn toward them - honestly, transparently, clearly, reliably.

Prof. Dr. A. Potschka
Chaos to order: fractals and Newton's method

Methods of mathematical modelling, simulation and optimization often lead to nonlinear systems of equations that can be solved using Newton's method. Newton's method is an iterative procedure in which a rough initial estimate of a solution is improved step by step. The lecture will show that small changes in the initial estimate can lead to significantly different results of Newton's method: fractal structures appear, which are aesthetically very pleasing to look at but can lead to undesired effects in practice. These fractal structures can be overcome if the step sizes of the method are suitably reduced. Conceptually, one can even take Newton's method with infinitesimally small step sizes, which leads to a continuous Newton flow instead of a discrete Newton iteration. This ensures that the result of Newton's method is the solution "closest" (in the sense of the Newton flow) to the initial estimate.
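
The fractal basins described above can be reproduced in a few lines. The following is a minimal Python sketch and not part of the lecture material: the example equation z^3 = 1, the grid and the damping factor are illustrative choices. Full Newton steps produce fractal basin boundaries; strongly damped steps approximate the continuous Newton flow.

```python
import numpy as np

roots = np.roots([1, 0, 0, -1])  # the three complex roots of z^3 - 1 = 0

def newton_basins(z, steps=60, damping=1.0):
    """Run (damped) Newton iterations on f(z) = z^3 - 1 and return,
    for each starting point, the index of the nearest root afterwards."""
    for _ in range(steps):
        z = z - damping * (z**3 - 1) / (3 * z**2)  # Newton step with step size `damping`
    return np.argmin(np.abs(roots[:, None, None] - z), axis=0)

# Grid of initial estimates in the complex plane (the origin is not a grid point)
x = np.linspace(-1.5, 1.5, 200)
X, Y = np.meshgrid(x, x)
Z0 = X + 1j * Y

full_step = newton_basins(Z0)                        # step size 1: fractal basins
damped = newton_basins(Z0, steps=1000, damping=0.1)  # tiny steps: Newton-flow-like basins
# e.g. compare the two with matplotlib: plt.imshow(full_step); plt.imshow(damped)
```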

Prof. Dr. B. Säfken
Opening a black box - How does deep learning work?

Deep learning has helped artificial intelligence achieve a breakthrough in recent years, with astonishing advances. AI methods are now used in a wide range of applications and products in our daily lives: whether in autonomous driving, smartphones, recommender systems or bank transfers, neural networks are everywhere. They are particularly powerful when it comes to analyzing complex data types such as images or text documents. In this lecture, we not only take a look at the rapid development of recent years, but also shed light on the dark side of AI, e.g. so-called deep fakes. Assessing the promise of AI developments, and taking a critical look at the technology, requires a basic understanding of how it works. From a mathematical point of view, AI is surprisingly simple: all it takes is some matrix multiplication, the chain rule and the optimization of a function. Nevertheless, AI often remains a black box. Modern research approaches, however, can shed some light on the subject.
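
To make that claim concrete, here is a small NumPy sketch of a tiny neural network, not from the lecture: the XOR data, layer sizes and learning rate are made-up illustrative choices. The forward pass is matrix multiplication, the backward pass is the chain rule, and the training loop is plain gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)  # hidden layer parameters
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)  # output layer parameters
lr = 0.5                                        # gradient descent step size

for step in range(5000):
    # Forward pass: two matrix multiplications with nonlinearities
    h = np.tanh(X @ W1 + b1)
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))      # sigmoid output
    loss = np.mean((out - y) ** 2)              # squared-error loss

    # Backward pass: gradients via the chain rule
    d_z2 = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_z2;  db2 = d_z2.sum(axis=0)
    d_h = (d_z2 @ W2.T) * (1 - h ** 2)
    dW1 = X.T @ d_h;   db1 = d_h.sum(axis=0)

    # Optimization: one gradient descent step per iteration
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(out.round(2))  # should end up close to the XOR targets [[0], [1], [1], [0]]
```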

Program

9:30 - 9:45    Welcome
9:45 - 10:30   Mathematics has many faces ... applied, turned away and turned toward (part 1), Prof. Dr. W. Herget
10:30 - 11:00  Coffee break
11:00 - 11:45  Mathematics has many faces ... applied, turned away and turned toward (part 2), Prof. Dr. W. Herget
11:45 - 13:15  Lunch break
13:15 - 14:15  Chaos to order: fractals and Newton's method, Prof. Dr. A. Potschka
14:15 - 14:45  Coffee break
14:45 - 15:45  Opening a black box - How does deep learning work?, Prof. Dr. B. Säfken
15:45 - 16:15  Discussion and closing remarks