Biological batch normalisation: How intrinsic plasticity improves learning in deep neural networks

dc.contributor.authorShaw, Nolan Peter
dc.contributor.authorJackson, Tyler
dc.contributor.authorOrchard, Jeff
dc.date.accessioned2026-05-06T17:21:19Z
dc.date.available2026-05-06T17:21:19Z
dc.date.issued2020-09-23
dc.description© 2020 Shaw et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
dc.description.abstractIn this work, we present a local intrinsic plasticity rule that we developed, dubbed IP, inspired by the Infomax rule. Like Infomax, this rule works by controlling the gain and bias of a neuron to regulate its firing rate. We discuss the biological plausibility of the IP rule and compare it to batch normalisation. We demonstrate that the IP rule improves learning in deep networks, and makes networks considerably more robust to increases in synaptic learning rates. We also sample the error gradients during learning and show that the IP rule substantially increases the size of the gradients over the course of learning. This suggests that the IP rule solves the vanishing gradient problem. Supplementary analysis derives the equilibrium solutions that the neuronal gain and bias converge to under our IP rule. A further analysis demonstrates that, when tested on a fixed input distribution, the IP rule yields neuronal information potential similar to that of Infomax. We show that batch normalisation also improves information potential, suggesting that this may be a cause of its efficacy, an open problem at the time of this writing.
dc.identifier.urihttps://doi.org/10.1371/journal.pone.0238454
dc.identifier.urihttps://hdl.handle.net/10012/23225
dc.language.isoen
dc.publisherPublic Library of Science
dc.relation.ispartofseriesPLoS ONE; 15(9); e0238454
dc.relation.urihttp://yann.lecun.com/exdb/mnist/
dc.relation.urihttps://www.cs.toronto.edu/~kriz/cifar.html
dc.rightsAttribution 4.0 International
dc.rights.urihttp://creativecommons.org/licenses/by/4.0/
dc.subjectmachine learning
dc.subjectneurons
dc.subjectlearning curves
dc.subjectentropy
dc.subjectneuronal plasticity
dc.subjectartificial neural networks
dc.subjectsingle neuron function
dc.subjectmachine learning algorithms
dc.titleBiological batch normalisation: How intrinsic plasticity improves learning in deep neural networks
dc.typeArticle
dcterms.bibliographicCitationShaw NP, Jackson T, Orchard J (2020) Biological batch normalisation: How intrinsic plasticity improves learning in deep neural networks. PLoS ONE 15(9): e0238454. https://doi.org/10.1371/journal.pone.0238454
uws.contributor.affiliation1Faculty of Mathematics
uws.contributor.affiliation2David R. Cheriton School of Computer Science
uws.peerReviewStatusReviewed
uws.scholarLevelFaculty
uws.typeOfResourceText

Files

Original bundle

Name: file (3).pdf
Size: 2.19 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 4.47 KB
Format: Item-specific license agreed upon to submission