Director, AI Initiative, KAUST; Scientific Director, Swiss AI Lab, IDSIA; Adj. Prof. of AI, Univ. Lugano; Co-Founder & Chief Scientist, NNAISENSE

Professor Jürgen Schmidhuber has been working since about age 15 toward creating an AI smarter than himself, after which he plans to retire. The Deep Learning neural networks (NNs) developed in his lab have revolutionized AI and machine learning. In 2009, his team's CTC-trained Long Short-Term Memory (LSTM) became the first recurrent NN to win international pattern recognition competitions. In 2010, fast deep feedforward NNs on GPUs developed in his group greatly outperformed earlier techniques without any unsupervised pre-training, a well-known deep learning technique he introduced in 1991. In 2011, his team's DanNet became the first feedforward NN to win computer vision contests, achieving superhuman performance.

By the mid-2010s, his lab's NNs were running on 3 billion devices and used billions of times per day by users of the world's most valuable public companies. For example, they greatly improved speech recognition on most smartphones, machine translation on Facebook and Google Translate (over 4 billion LSTM-based translations per day), Apple's Siri and QuickType on iPhones, the answers of Amazon's Alexa, and many other applications. In May 2015, his team published the Highway Net, the first working truly deep feedforward NN with hundreds of layers. Its open-gated version, called ResNet (December 2015), has become the most cited NN of the 21st century, surpassing LSTM, which Bloomberg called arguably the most commercial AI achievement. NNs developed in his lab are also widely used in healthcare and medicine, extending and enhancing human life.

Since 1987 he has developed meta-learning machines driven by artificial curiosity, unsupervised generative adversarial NNs that compete against one another in a minimax game, and neural fast weight programmers (1991), which are formally equivalent to Transformers with linearized self-attention. His formal theory of creativity, curiosity, and fun explains art, science, music, and humor. He also proposed Low-Complexity Art, the information age's ultimate form of minimal art, and generalized algorithmic information theory and the many-worlds theory of physics. A recipient of numerous awards, he is Chief Scientist of the company NNAISENSE, which aims to build the first practical general-purpose AI. He is a frequent keynote speaker and advises various governments on AI strategy.