
Geoffrey Hinton publishes new deep learning algorithm

Published on Jan 11, 2023

Geoffrey Hinton, professor at the University of Toronto and engineer at Google Brain, recently published a paper on the Forward-Forward algorithm (FF), a neural network training technique that uses two forward passes of data to update model weights instead of backpropagation.

Hinton’s algorithm addresses some of the shortcomings of backpropagation training, which requires a full understanding of the forward-pass computation in order to compute derivatives, and which must store activation values. Hinton’s insight was to use two forward passes of input data, one positive and one negative, with opposite objective functions. Hinton showed that networks trained with FF could perform computer vision (CV) tasks about as well as those trained with backpropagation.

The standard backpropagation algorithm used to train artificial neural networks (ANNs) is not based on any known biological process. In addition to being biologically implausible, backpropagation has some computational drawbacks. ANNs can be trained using reinforcement learning (RL) without backpropagation, but “for large networks containing millions or billions of parameters, this technique scales poorly.” We covered zero-divergence inference learning (Z-IL), another biologically plausible alternative to backpropagation, in 2021.

Hinton’s FF algorithm replaces the forward-backward passes of backpropagation training with two forward passes that “operate exactly the same way as each other.” In the first pass, positive input drawn from the training set is fed through the network, and the weights are adjusted to increase each layer’s goodness. In the second pass, the network is given a generated negative example, and the weights are adjusted to decrease each layer’s goodness.
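To make the two passes concrete, here is a minimal sketch of a single FF-style layer, assuming (per Hinton’s paper) that a layer’s goodness is the sum of its squared activations and that inputs are length-normalized before each layer so goodness does not leak forward. The class name, learning rate, and layer sizes are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

class FFLayer:
    """One layer trained with local Forward-Forward-style updates."""

    def __init__(self, n_in, n_out, lr=0.03):
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.lr = lr

    def _normalize(self, x):
        # Pass on only the direction of the previous layer's activity,
        # so that layer's goodness cannot leak into this one.
        return x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)

    def forward(self, x):
        return np.maximum(0.0, self._normalize(x) @ self.W)  # ReLU

    def goodness(self, x):
        # Goodness = sum of squared activations.
        return (self.forward(x) ** 2).sum(axis=1)

    def update(self, x, positive):
        # Local gradient ascent (positive pass) or descent (negative
        # pass) on goodness -- no error signal from later layers.
        xn = self._normalize(x)
        h = np.maximum(0.0, xn @ self.W)
        sign = 1.0 if positive else -1.0
        # d(goodness)/dW = 2 * x^T h, since ReLU' is folded into h > 0
        self.W += self.lr * sign * 2.0 * (xn.T @ h)

# Both passes use the same forward computation; only the sign differs:
layer = FFLayer(8, 16)
x = rng.normal(size=(1, 8))
before = layer.goodness(x)[0]
layer.update(x, positive=True)    # positive pass raises goodness
layer.update(x, positive=False)   # negative pass lowers it
```

Because each layer's update depends only on its own input and activations, no backward pass through the rest of the network is needed.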

Hinton trained several neural networks to perform CV tasks on the MNIST and CIFAR datasets using FF. The networks contained two or three hidden convolutional layers and were trained for fewer than 100 epochs. FF-trained networks performed “only slightly worse” on test datasets than backpropagation-trained networks.
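At test time, the MNIST networks in Hinton's paper classify an image by embedding each candidate label into the input (replacing the first ten pixels) and choosing the label whose overlay produces the highest accumulated goodness. A rough sketch of that inference loop, with hypothetical helper names and untrained random weights standing in for a trained network:

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(z):
    return np.maximum(0.0, z)

def predict(x, weight_mats, n_classes=10):
    """Try each label overlay; return the class with the highest
    total goodness summed over layers (hypothetical helper)."""
    scores = []
    for label in range(n_classes):
        h = x.copy()
        h[:n_classes] = 0.0
        h[label] = 1.0          # embed the candidate label in the input
        total = 0.0
        for W in weight_mats:
            h = h / (np.linalg.norm(h) + 1e-8)  # normalize between layers
            h = relu(h @ W)
            total += (h ** 2).sum()             # accumulate goodness
        scores.append(total)
    return int(np.argmax(scores))

# Random weights, just to show the interface (sizes are illustrative):
weights = [rng.normal(0, 0.1, (784, 200)), rng.normal(0, 0.1, (200, 200))]
x = rng.normal(size=784)
pred = predict(x, weights)   # an integer class index in [0, 10)
```

Note that inference requires one forward pass per candidate label, a cost backpropagation-trained classifiers do not pay.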
