Geoffrey Hinton publishes new deep learning algorithm
Published on Jan 11, 2023
Geoffrey Hinton, professor at the University of Toronto and engineering fellow at Google Brain, recently published a paper on the Forward-Forward algorithm (FF), a neural network training technique that uses two forward passes of data to update model weights instead of backpropagation.
Hinton’s algorithm addresses some of the shortcomings of backpropagation training, which requires full knowledge of the forward-pass computation in order to calculate derivatives and must store activation values along the way. Hinton’s insight was to use two forward passes of input data, one positive and one negative, which have opposite objective functions. Hinton showed that networks trained with FF could perform computer vision (CV) tasks about as well as those trained with backpropagation.
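The "opposite objective functions" can be made concrete with a small sketch. The article does not give formulas; the choices below, a goodness measure equal to the sum of squared activations compared against a threshold `theta`, follow common FF formulations and are assumptions, not details stated in the text.

```python
import numpy as np

theta = 2.0  # goodness threshold (assumed hyperparameter)

def goodness(h):
    """Goodness of a layer's activation vector h: sum of squared activations."""
    return float(np.sum(h * h))

def p_positive(h):
    """Probability the layer assigns to 'this input is positive (real) data'."""
    return 1.0 / (1.0 + np.exp(-(goodness(h) - theta)))

# Positive pass objective: minimize -log p_positive(h)
#   (push goodness above theta)
# Negative pass objective: minimize -log (1 - p_positive(h))
#   (push goodness below theta)
```

The two passes thus optimize the same quantity in opposite directions, which is what lets them "operate exactly the same way as each other."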
The standard backpropagation algorithm used to train artificial neural networks (ANNs) is not based on any known biological process. In addition to being biologically implausible, backpropagation has some computational drawbacks. ANNs can be trained using reinforcement learning (RL) without backpropagation, but “for large networks containing millions or billions of parameters, this technique scales poorly.” We covered zero-divergence inference learning (Z-IL) in 2021 as a biologically plausible alternative to backpropagation.
Hinton’s FF algorithm replaces the forward-backward passes of backpropagation training with two forward passes that “operate exactly the same way as each other.” In the first pass, the network is given a positive example from the training set, and each layer’s weights are adjusted to increase that layer’s goodness value. In the second pass, the network is given a generated negative example, and the weights are adjusted to decrease each layer’s goodness.
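The per-layer update described above can be sketched for a single fully connected layer. This is a minimal illustration, not Hinton’s implementation: the goodness measure (sum of squared ReLU activations), the threshold `theta`, the learning rate, and the layer sizes are all assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# One fully connected layer trained locally with a Forward-Forward-style rule.
W = rng.normal(scale=0.1, size=(784, 500))  # assumed MNIST-like input size
theta = 2.0   # goodness threshold (assumed hyperparameter)
lr = 0.03     # learning rate (assumed)

def forward(x):
    """ReLU activations of the layer for input vector x."""
    return np.maximum(x @ W, 0.0)

def ff_step(x, positive):
    """One local update: raise goodness for positive data, lower it for negative.

    No backward pass is needed: the gradient of the layer's own objective
    with respect to W depends only on the input x and the activations h.
    """
    global W
    h = forward(x)
    g = float(np.sum(h * h))                     # goodness of this layer
    p = 1.0 / (1.0 + np.exp(-(g - theta)))       # P(input is positive)
    coeff = (1.0 - p) if positive else -p        # d(log-likelihood)/d(goodness)
    # d(goodness)/dW = 2 * outer(x, h); inactive units (h=0) get no update.
    W += lr * coeff * 2.0 * np.outer(x, h)
    return g
```

Because each layer has its own local objective, layers can in principle be updated as soon as their forward computation finishes, without waiting for a global backward pass.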
Hinton trained several neural networks to perform CV tasks on MNIST and CIFAR datasets using FF. They contained two or three hidden convolutional layers, and were trained for less than 100 epochs. FF-trained networks performed “only slightly worse” on test datasets than backpropagation-trained networks.