[HTM Forum] [Other Topics/Community Lounge] Nice video on ReLU activation function
I was watching a nice video on the ReLU activation function and the approximating capacity of neural networks: [https://youtu.be/UXs4ZxKaglg](https://youtu.be/UXs4ZxKaglg)

I don't know why they throw away half the information with ReLU (f(x) = x for x >= 0, f(x) = 0 for x < 0) when they could use the switch slope…
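Since the post only sketches the idea, here is a minimal NumPy sketch comparing standard ReLU with a two-slope "switched" variant (similar in spirit to leaky ReLU). The `switched_slope` name and the `neg_slope` parameter are illustrative assumptions, not something from the video or the original post:

```python
import numpy as np

def relu(x):
    # Standard ReLU: passes positive inputs through, zeroes out negatives,
    # so the sign information of the negative half is discarded.
    return np.where(x >= 0, x, 0.0)

def switched_slope(x, neg_slope=0.1):
    # Hypothetical "switch slope" variant: the sign of x switches between
    # two slopes instead of discarding the negative half entirely.
    # neg_slope is an illustrative parameter, not from the original post.
    return np.where(x >= 0, x, neg_slope * x)

x = np.linspace(-2.0, 2.0, 9)
print(relu(x))            # negative half mapped to 0
print(switched_slope(x))  # negative half scaled, information retained
```

With a nonzero negative slope the function stays invertible on the negative half, which is one way to read the complaint about "throwing away half the information."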