Thursday, October 19, 2023

Experiment

Looking to design little tests for myself, to better appreciate the parameters at work. Here are the different activation functions:


ReLU: f(x) = max(0, x)

Sigmoid: σ(x) = 1 / (1 + exp(-x))

Tanh: tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))
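
To keep these straight, here is a minimal NumPy sketch of the three functions (the library choice is mine, not part of the original notes):

import numpy as np

def relu(x):
    # ReLU: zero for negative inputs, identity for positive inputs
    return np.maximum(0, x)

def sigmoid(x):
    # Sigmoid: squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Tanh: squashes inputs into (-1, 1), zero-centered
    return np.tanh(x)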

Performed a simple classification task using each activation function in turn... the horizontal and vertical coordinates of each point are the inputs...
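
Roughly the shape of the experiment, sketched with scikit-learn; the dataset, network size, and iteration count here are stand-in assumptions, not the exact setup I ran:

# Compare ReLU, sigmoid, and tanh on a toy 2-D classification task.
# make_moons and the (8,) hidden layer are illustrative stand-ins.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for activation in ["relu", "logistic", "tanh"]:  # "logistic" is sklearn's sigmoid
    clf = MLPClassifier(hidden_layer_sizes=(8,), activation=activation,
                        max_iter=2000, random_state=0)
    clf.fit(X_train, y_train)
    print(f"{activation:8s}  train={clf.score(X_train, y_train):.3f}  "
          f"test={clf.score(X_test, y_test):.3f}  final loss={clf.loss_:.4f}")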

Both ReLU and Tanh drove the loss to zero, Tanh more quickly than ReLU.

Sigmoid got caught in .001 purgatory on both train and test!!

The model is too simple to get past a noise setting of 20.
