AI Scientists Awarded Nobel Prize in Physics for Groundbreaking Work in Machine Learning

Pioneers in Artificial Neural Networks Lay the Foundation for Modern Machine Learning

John Hopfield of Princeton University and Geoffrey Hinton of the University of Toronto have been awarded the 2024 Nobel Prize in Physics.

The two scientists used principles from physics to develop methods that form the foundation of today's powerful machine learning, a branch of artificial intelligence. Their work was fundamental in shaping the field before the rise of what is now known as generative AI.

Hopfield created an associative memory that can store and reconstruct images and other types of patterns from data. Hinton invented a method that autonomously identifies features in data, enabling tasks such as detecting specific elements in images.

When we speak about artificial intelligence, we often refer to machine learning using artificial neural networks. This technology was initially inspired by the structure of the brain. In an artificial neural network, neurons are represented by nodes with different values. These nodes influence each other through connections that can be compared to synapses, which can become stronger or weaker over time. The network is trained by reinforcing connections between nodes with high simultaneous values. The Nobel laureates have significantly contributed to the development of artificial neural networks since the 1980s.
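
The training rule just described, strengthening the connections between nodes that have high values at the same time, can be sketched in a few lines of code. The toy example below is purely illustrative; the network size, activity patterns, and learning rate are assumptions made for this sketch, not anything from the laureates' work.

```python
import numpy as np

# Toy illustration of Hebbian-style training: all sizes and patterns are
# hypothetical, chosen only to show the update rule.
n_nodes = 4
weights = np.zeros((n_nodes, n_nodes))   # connection strengths ("synapses")

# Example activity patterns: 1 = node has a high value, 0 = a low value.
patterns = np.array([[1, 1, 0, 0],
                     [0, 0, 1, 1]])

learning_rate = 0.1
for activity in patterns:
    # Reinforce the connection between every pair of nodes that have
    # high values simultaneously in this pattern.
    weights += learning_rate * np.outer(activity, activity)
np.fill_diagonal(weights, 0)             # nodes do not connect to themselves

print(weights)
```

After training, the connections between nodes 0 and 1 and between nodes 2 and 3 are stronger than the rest, reflecting which nodes were active together in the example patterns.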

John Hopfield invented an associative memory network that can store and recreate patterns. The nodes in this network can be imagined as pixels. The Hopfield network applies principles from physics that describe a material's properties in terms of its atomic spin, the feature that makes each atom a tiny magnet. The network as a whole is described in a manner analogous to the energy of such a spin system, and it is trained by finding values for the connections between the nodes so that the stored images correspond to states of low energy. When the Hopfield network is fed a distorted or incomplete image, it systematically adjusts the nodes' values so that the network's energy falls, allowing it to gradually retrieve the stored image that most closely resembles the imperfect one it was given.
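
A minimal sketch of this retrieval process is given below, assuming a small binary Hopfield-style network; the pattern size, the amount of corruption, and the number of update sweeps are hypothetical choices made only for illustration.

```python
import numpy as np

def train(patterns):
    """Hebbian rule: reinforce connections between nodes ("pixels") that
    take the same value across the stored +1/-1 patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)               # no self-connections
    return W / len(patterns)

def energy(W, state):
    """Energy analogous to a spin system: stored patterns sit in low-energy states."""
    return -0.5 * state @ W @ state

def recall(W, state, sweeps=5):
    """Work through the nodes, setting each to the value that lowers the
    energy, so a distorted input slides toward the nearest stored pattern."""
    state = state.copy()
    for _ in range(sweeps):
        for i in range(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Hypothetical example: store one 25-pixel pattern, then recover it from a noisy copy.
rng = np.random.default_rng(0)
stored = rng.choice([-1, 1], size=25)
W = train(stored[None, :])

noisy = stored.copy()
noisy[:5] *= -1                          # corrupt the first five pixels
print("energy of noisy input:", energy(W, noisy))
restored = recall(W, noisy)
print("energy after recall:  ", energy(W, restored))
print("stored pattern recovered:", bool(np.array_equal(restored, stored)))
```

The sequential update order ensures the energy never increases, so the corrupted input settles into the low-energy state where the stored pattern was placed during training.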

Geoffrey Hinton built on Hopfield’s work to develop a new type of network, the Boltzmann machine, which can learn to recognize characteristic features in a given set of data. Hinton applied tools from statistical physics, the branch of physics that studies systems built from many similar components. The machine is trained by feeding it examples and adjusting its connections so that those examples become highly probable when the machine is run. A trained Boltzmann machine can be used to classify images or to generate new examples of the type of pattern it was trained on. Hinton’s contributions helped set off the explosive development of machine learning seen today.
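
As a rough illustration of the idea, the sketch below uses a restricted Boltzmann machine, a simplified variant with connections only between a layer of visible nodes and a layer of hidden feature detectors, and trains it with a one-step contrastive-divergence update instead of the full Boltzmann learning procedure. The layer sizes, training patterns, and learning rate are assumptions for this example, and bias terms are omitted for brevity.

```python
import numpy as np

# Sketch of a restricted Boltzmann machine (visible layer <-> hidden layer only).
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden = 6, 3
W = rng.normal(0.0, 0.1, size=(n_visible, n_hidden))   # connection strengths

# Training examples: binary patterns the machine should come to find probable.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [1, 1, 0, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1],
                 [0, 0, 1, 1, 1, 0]], dtype=float)

learning_rate = 0.1
for epoch in range(500):
    for v0 in data:
        # Upward pass: probability that each hidden feature detector switches on.
        h_prob = sigmoid(v0 @ W)
        h_sample = (rng.random(n_hidden) < h_prob).astype(float)
        # Downward pass: reconstruct a visible pattern from the hidden features.
        v_recon = sigmoid(h_sample @ W.T)
        h_recon = sigmoid(v_recon @ W)
        # Nudge the weights so real data becomes more probable than the reconstruction.
        W += learning_rate * (np.outer(v0, h_prob) - np.outer(v_recon, h_recon))

# Generate a new example: start from noise and let the trained machine settle.
v = (rng.random(n_visible) < 0.5).astype(float)
for _ in range(20):
    h = (rng.random(n_hidden) < sigmoid(v @ W)).astype(float)
    v = (rng.random(n_visible) < sigmoid(h @ W.T)).astype(float)
print("generated pattern:", v)
```

In a full Boltzmann machine every node may connect to every other node and training relies on much longer sampling runs; the restricted layout and single sampling step are used here only to keep the example short.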
