Two Paths to Intelligence – A Lecture by Geoffrey Hinton

Geoffrey Hinton, widely recognized as the “Godfather of AI,” made headlines when he announced his departure from Google so that he could speak openly about the potential dangers of the AI technology he helped pioneer. In a marked shift from the optimism he expressed as recently as 2014, Hinton now warns that AI could get out of control and surpass human intelligence within the next two decades. He points to the advanced language understanding of large language models (LLMs) such as OpenAI’s ChatGPT, and to how such models can accumulate knowledge far faster than any individual human brain.

In a public lecture [see video below] at the University of Cambridge on May 25, 2023, Geoffrey Hinton contrasts two paths to intelligence: digital and analog computation. Digital computers are designed to execute instructions exactly, which lets the same computation run on different hardware, but at the cost of high energy use and precise fabrication. For computers that learn, Hinton proposes abandoning the separation of software and hardware in favor of low-power analog computation that exploits the idiosyncratic properties of each specific piece of hardware. He discusses learning algorithms that can make use of these analog properties without requiring a detailed model of them, and he notes that such computation is mortal: the knowledge it acquires dies with the hardware it lives in.
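One family of learning rules that fits this description is weight perturbation: the learner nudges its weights at random, measures whether the loss improved, and reinforces helpful nudges. It needs only forward passes through the (possibly imperfectly characterized, analog) hardware, never a model of it. The sketch below illustrates the principle; it is not necessarily the exact procedure Hinton has in mind, and the toy `forward` function, loss, and hyperparameters are assumptions for illustration.

```python
import numpy as np

# Illustrative weight-perturbation sketch (not the lecture's exact method).
# Idea: instead of computing gradients from a precise model of the hardware,
# nudge the weights randomly, re-measure the loss, and move the weights in
# proportion to the observed improvement. Only forward passes are needed,
# which the (analog) hardware itself can supply.

rng = np.random.default_rng(0)

def forward(w, x):
    # Stand-in for the hardware's input-output behavior.
    return np.tanh(x @ w)

def loss(w, x, y):
    return np.mean((forward(w, x) - y) ** 2)

def weight_perturbation_step(w, x, y, sigma=1e-3, lr=0.05):
    base = loss(w, x, y)
    delta = rng.normal(0.0, sigma, size=w.shape)   # random nudge
    trial = loss(w + delta, x, y)                  # measure its effect
    # Move along the nudge if it helped, against it if it hurt;
    # (trial - base) / sigma**2 * delta is a noisy gradient estimate.
    return w - lr * (trial - base) / sigma**2 * delta

# Toy usage: learn a 2-input, 1-output mapping by trial and error.
x = rng.normal(size=(32, 2))
y = np.tanh(x @ np.array([[0.5], [-0.3]]))
w = rng.normal(size=(2, 1))
for _ in range(2000):
    w = weight_perturbation_step(w, x, y)
print(loss(w, x, y))  # should end up small
```

The catch, which motivates Hinton's argument for digital computation, is that such perturbation-based rules scale poorly: one noisy scalar measurement per trial must inform every weight.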

Digital computation, by contrast, can run many identical copies of a model on different hardware, which makes knowledge sharing efficient: the copies can pool what they learn simply by averaging their weight changes. Hinton emphasizes how well the backpropagation learning procedure scales on digital hardware and argues that large-scale digital systems may acquire knowledge more efficiently than biological computation, potentially surpassing human intelligence in the near future. The lecture, organized by the Centre for the Study of Existential Risk, examines these two paths to intelligence and the evolving landscape of artificial and biological computation.
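The weight-averaging scheme Hinton describes can be made concrete with a toy sketch (closely related to federated averaging): identical copies of a model train on disjoint data shards and periodically pool their knowledge by averaging their weights. The linear model and all names below are assumptions for illustration, not details from the lecture.

```python
import numpy as np

# Sketch of knowledge sharing between identical digital models: each copy
# trains on its own data shard, then all copies pool what they learned by
# averaging their weights. This only works because the copies are bit-for-bit
# identical -- weights from one "mortal" analog device would be meaningless
# on another. (Illustrative scheme; names are assumptions, not the lecture's.)

rng = np.random.default_rng(1)

def grad_step(w, x, y, lr=0.1):
    # One gradient step on linear least squares: loss = mean((x @ w - y)**2).
    err = x @ w - y
    return w - lr * 2 * x.T @ err / len(x)

true_w = np.array([[1.0], [-2.0]])
shards = []
for _ in range(4):                       # 4 copies, 4 disjoint data shards
    x = rng.normal(size=(64, 2))
    shards.append((x, x @ true_w))

w = np.zeros((2, 1))                     # shared starting weights
for _ in range(100):                     # communication rounds
    local = [grad_step(w, x, y) for x, y in shards]
    w = np.mean(local, axis=0)           # share knowledge by averaging

print(w.round(3))                        # close to true_w
```

Each copy ends up knowing what every other copy learned, at a communication cost of one weight vector per round; this is the bandwidth advantage Hinton contrasts with the slow, lossy knowledge transfer available to biological brains.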

See the full lecture below: