Room P3.10, Mathematics Building
In his famous lecture in Paris in 1900, Hilbert formulated 23 challenging mathematical problems which inspired many groundbreaking mathematical investigations in the last century. Among these problems, the 13th asked for nomographic solutions of certain functional equations. Hilbert's own conjecture was that for certain equations of higher degree no such solution could be constructed. Surprisingly, 57 years later it turned out that Hilbert was wrong, when Kolmogorov succeeded in proving his Superposition Theorem, which states that every multivariate continuous real-valued function can be represented as a superposition and composition of one-dimensional continuous functions. Another 30 years later, this theorem found an interesting application in the theory of neural networks. We will discuss a computable version of Kolmogorov's Superposition Theorem, which states that every multivariate computable real-valued function can be represented as a superposition and composition of one-dimensional computable real number functions. As a consequence, we can precisely characterize the computational power of feedforward neural networks with one hidden layer and computable activation functions.
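For orientation, one standard formulation of the classical (continuous) Superposition Theorem reads as follows; the symbols Φ_q and φ_{q,p} below are not from the abstract and are used only for illustration. For every continuous f : [0,1]^n → R there exist continuous one-dimensional functions Φ_q and φ_{q,p} such that

    f(x_1, \dots, x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \varphi_{q,p}(x_p) \right),

where the inner functions φ_{q,p} can even be chosen independently of f. The connection to neural networks comes from the fact that a feedforward network with one hidden layer computes exactly such a sum of compositions of one-dimensional functions, with the activation function playing the role of the one-dimensional building blocks.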