Activation_Function

I train a 7-layer DNN on the notMNIST dataset and test it with different activation functions. For each activation function, the confusion matrix, classification report, and AUC score are generated.
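
Below is a minimal sketch of this pipeline, assuming the notMNIST images (28x28 grayscale letters A-J) have already been flattened into 784-feature NumPy arrays with integer labels 0-9. The layer widths, optimizer, training settings, the use of scikit-learn for the metrics, and the reading of "7 layers" as seven hidden layers are my assumptions, not taken from this repository.

```python
import numpy as np
import tensorflow as tf
from sklearn.metrics import confusion_matrix, classification_report, roc_auc_score

def build_dnn(activation, input_dim=28 * 28, n_classes=10):
    """Fully connected DNN with a swappable hidden activation (widths are assumed)."""
    model = tf.keras.Sequential([tf.keras.Input(shape=(input_dim,))])
    for units in (512, 512, 256, 256, 128, 128, 64):  # assumed 7 hidden layers
        model.add(tf.keras.layers.Dense(units, activation=activation))
    model.add(tf.keras.layers.Dense(n_classes, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Stand-in random arrays so the sketch runs end to end; replace them with the
# real notMNIST training and test splits.
rng = np.random.default_rng(0)
x_train = rng.random((1000, 784), dtype=np.float32)
y_train = rng.integers(0, 10, size=1000)
x_test = rng.random((200, 784), dtype=np.float32)
y_test = rng.integers(0, 10, size=200)

model = build_dnn("relu")  # swap in any activation from the list below
model.fit(x_train, y_train, epochs=5, batch_size=128, validation_split=0.1, verbose=0)

probs = model.predict(x_test)   # class probabilities, shape (N, 10)
preds = probs.argmax(axis=1)    # hard predictions
print(confusion_matrix(y_test, preds))
print(classification_report(y_test, preds))
print("AUC (one-vs-rest):", roc_auc_score(y_test, probs, multi_class="ovr"))
```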

Activation functions (definitions for the less common ones are sketched after the list):

  1. ReLU
  2. Swish
  3. Tanh
  4. LReLU
  5. LReLU (0.25)
  6. PReLU
  7. Softplus
  8. ELU
  9. FReLU (initialized at -0.398)
  10. FReLU
  11. Flatten-T Swish
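
Most of these are available as Keras built-ins ("relu", "tanh", "elu", "softplus", tf.keras.layers.LeakyReLU, tf.keras.layers.PReLU), but a few are not. Below is a hedged sketch of how the less common ones could be defined in TensorFlow/Keras; the class names, the default T value for Flatten-T Swish, and the layer structure are my assumptions, and only the -0.398 FReLU initialization is taken from the list above.

```python
import tensorflow as tf

def swish(x):
    """Swish: x * sigmoid(x); can be passed directly as a Dense activation."""
    return x * tf.sigmoid(x)

class FReLU(tf.keras.layers.Layer):
    """Flexible ReLU: relu(x) + b, with a single learnable bias b."""
    def __init__(self, init_b=-0.398, **kwargs):  # -0.398 as listed above
        super().__init__(**kwargs)
        self.init_b = init_b

    def build(self, input_shape):
        self.b = self.add_weight(
            name="b", shape=(),
            initializer=tf.keras.initializers.Constant(self.init_b),
            trainable=True)

    def call(self, x):
        return tf.nn.relu(x) + self.b

class FlattenTSwish(tf.keras.layers.Layer):
    """Flatten-T Swish: x * sigmoid(x) + T for x >= 0, and T otherwise."""
    def __init__(self, t=-0.20, **kwargs):  # T = -0.20 is an assumed default
        super().__init__(**kwargs)
        self.t = t

    def call(self, x):
        return tf.nn.relu(x) * tf.sigmoid(x) + self.t
```

Function-style activations (swish, "relu", "tanh", "elu", "softplus") can be passed straight to Dense(..., activation=...), while layer-style ones (FReLU, FlattenTSwish, and the built-in PReLU and LeakyReLU layers) are added after a Dense layer that has no activation, e.g. model.add(tf.keras.layers.Dense(512)) followed by model.add(FReLU()).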
