Chapter 5

Neural Calculus

Abstract

A stable information theory of neural computation was introduced. It was suggested that this theory might be consistent with how the learning of memory weights could arise in dynamic spiking neural models. It was noted that these models share the same essential quantitative components as RELR, including fast graded potentials built from polynomial terms up to the fourth degree, interaction terms, and slower recovery currents with properties similar to RELR's posterior memory weights, which become prior weights in future learning. It was also noted that RELR's ability to learn patterns from very small samples of observations drawn from very high-dimensional candidate features could be similar to what occurs ...
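
For readers who want a concrete picture of the two quantitative components named above, the following is a minimal sketch, not an implementation of RELR itself: candidate features are built from polynomial terms up to the fourth degree plus pairwise interaction terms, and a logistic regression's ridge penalty is centered on prior weights so that each fitted posterior becomes the prior for the next episode of learning. All function names and parameter values are hypothetical illustrations, and the prior-centered penalty stands in here for RELR's actual estimation procedure.

import numpy as np

def expand_features(x):
    """Polynomial terms x, x^2, x^3, x^4 per input, plus pairwise interactions."""
    cols = [x ** d for d in range(1, 5)]            # degrees 1..4
    n = x.shape[1]
    inter = [x[:, [i]] * x[:, [j]] for i in range(n) for j in range(i + 1, n)]
    return np.hstack(cols + inter)

def fit_logistic(X, y, prior_w, lam=1.0, lr=0.1, steps=500):
    """Gradient descent on logistic loss + (lam/2) * ||w - prior_w||^2."""
    w = prior_w.copy()
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))            # predicted probabilities
        grad = X.T @ (p - y) / len(y) + lam * (w - prior_w)
        w -= lr * grad
    return w                                         # posterior weights

rng = np.random.default_rng(0)
x = rng.normal(size=(30, 3))                         # very small sample, as in the text
X = expand_features(x)
y = (x[:, 0] + 0.5 * x[:, 1] * x[:, 2] > 0).astype(float)

w = np.zeros(X.shape[1])                             # flat prior before any data
for batch in np.array_split(np.arange(len(y)), 3):   # sequential learning episodes
    w = fit_logistic(X[batch], y[batch], prior_w=w)  # posterior becomes next prior

Because the penalty pulls each new fit back toward the previous posterior, earlier learning persists as a prior across episodes, loosely mirroring the slower recovery currents described above.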
