ReLU as a literal switch

Started by S6Regen

The ReLU neural network activation function can be read as a literal switch: when its input is positive the switch is on and the signal passes through unchanged (f(x) = x), and when the input is zero or negative the switch is off and the signal is blocked (f(x) = 0). For any fixed pattern of switch states, the network then acts as a plain linear map.
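A minimal sketch of this reading, in NumPy (my own illustration, not code from the linked post): the 0/1 switch decision can be factored out of the signal path, which makes it explicit that once the switch states are fixed, what remains is purely linear.

```python
import numpy as np

def relu(x):
    """Standard ReLU: max(x, 0)."""
    return np.maximum(x, 0.0)

def relu_as_switch(x):
    """The same function, written as a literal switch:
    a 0/1 gate that connects or disconnects the signal."""
    gate = (x > 0.0).astype(float)  # 1 where the switch is "on", 0 where it is "off"
    return gate * x                 # on: pass x through unchanged; off: block it

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))            # [0.  0.  0.  1.5 3. ]
print(relu_as_switch(x))  # identical output
```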

The variance equation for linear combinations of random variables then suggests a route to a general associative memory algorithm.
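For reference, the equation being invoked: for uncorrelated random variables X_i with weights a_i, Var(sum_i a_i X_i) = sum_i a_i^2 Var(X_i). A quick numerical check of it (my own sketch; the associative memory construction itself is in the linked post):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Weights a_i and variances Var(X_i) for independent (hence uncorrelated) X_i.
a = np.array([0.5, -1.0, 2.0])
var_x = np.array([1.0, 4.0, 0.25])

# The variance equation: Var(sum_i a_i X_i) = sum_i a_i^2 Var(X_i).
predicted = np.sum(a**2 * var_x)

# Monte Carlo check using zero-mean Gaussians with the given variances.
samples = rng.normal(0.0, np.sqrt(var_x), size=(200_000, 3))
empirical = np.var(samples @ a)

print(predicted)  # 5.25
print(empirical)  # close to 5.25
```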

https://ai462qqq.blogspot.com/2019/11/artificial-neural-networks.html
