Wednesday 16 January 2019

TensorFlow Keras Activation Functions


keras.activations.softmax     (x, axis  = -1 )
keras.activations.elu         (x, alpha = 1.0)
keras.activations.selu        (x)
keras.activations.softplus    (x)
keras.activations.softsign    (x)
keras.activations.relu        (x, alpha = 0.0, max_value=None, threshold=0.0)
keras.activations.tanh        (x)
keras.activations.sigmoid     (x)
keras.activations.hard_sigmoid(x)
keras.activations.exponential (x)
keras.activations.linear      (x)
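
A minimal usage sketch of these activations (assuming standalone Keras 2.x, matching the signatures above; with tf.keras, swap "keras" for "tensorflow.keras"):

    from keras import activations
    from keras import backend as K
    from keras.models import Sequential
    from keras.layers import Dense

    # 1) Pass an activation to a layer, either by name or as a function object
    model = Sequential([
        Dense(64, activation='relu', input_shape=(10,)),
        Dense(32, activation=activations.elu),        # function object also works
        Dense(3,  activation=activations.softmax),    # softmax over the last axis
    ])

    # 2) Call an activation directly on a backend tensor
    x = K.constant([[-2.0, 0.0, 2.0]])
    y = activations.relu(x, alpha=0.1, max_value=1.0)  # leaky slope 0.1, capped at 1.0
    print(K.eval(y))                                    # [[-0.2  0.   1. ]]

Passing the string name ('relu', 'softmax', ...) and passing the function from keras.activations are interchangeable; the function form is useful when you need non-default arguments, e.g. via a lambda or a Lambda/Activation layer.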

Reference: keras.io
