Currently we have 12 activation functions: clamp, elu, hardSigmoid, hardSwish, leakyRelu, linear, relu, sigmoid, softmax, softplus, softsign, tanh.
Should we define `MLActivationOptions` as option 1 or option 2?

1. A generic dictionary, with the specifics defined in spec prose / algorithm steps:
   ```webidl
   typedef object MLActivationOptions;
   ```
2. An explicit union type:
   ```webidl
   typedef (MLClampOptions or MLLeakyReluOptions or ...) MLActivationOptions;
   ```
There was related discussion in #337. The current preference looks to be option 2.
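To illustrate the trade-off, here is a rough TypeScript-flavored sketch of what option 2's closed union buys over option 1's opaque `object`. The dictionary shapes below (`minValue`/`maxValue` for clamp, `alpha` for leakyRelu/elu) follow the current operator options, but the `describe` helper and its structural dispatch are purely hypothetical, not part of any spec text:

```typescript
// Assumed per-activation option shapes (sketch, not normative WebIDL).
interface MLClampOptions { minValue?: number; maxValue?: number; }
interface MLLeakyReluOptions { alpha?: number; }
interface MLEluOptions { alpha?: number; }

// Option 2: an explicit union. Tooling and readers can enumerate the
// allowed member dictionaries and their fields.
type MLActivationOptions = MLClampOptions | MLLeakyReluOptions | MLEluOptions;

// Option 1's analogue would be an open type, deferring all validation
// to prose / algorithm steps:
//   type MLActivationOptions = object;

// Hypothetical helper: WebIDL-style unions of dictionaries are
// distinguished structurally, which this dispatch mimics.
function describe(options: MLActivationOptions): string {
  if ("minValue" in options || "maxValue" in options) {
    const o = options as MLClampOptions;
    return `clamp [${o.minValue ?? -Infinity}, ${o.maxValue ?? Infinity}]`;
  }
  if ("alpha" in options) {
    return `alpha = ${(options as MLLeakyReluOptions).alpha}`;
  }
  return "defaults";
}

console.log(describe({ minValue: 0, maxValue: 6 }));
console.log(describe({ alpha: 0.1 }));
```

Note that overlapping shapes (leakyRelu and elu both using `alpha`) show one cost of the union approach: some members are structurally indistinguishable, so the spec prose still has to say which dictionary applies for which activation.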