Dandelion's activation module is mostly inherited from Lasagne; the exceptions are the softmax() and log_softmax() activations, documented below.

For the following activations, refer to the Lasagne.nonlinearities documentation (a brief usage sketch follows the list):

  • sigmoid
  • tanh
  • relu
  • softplus
  • ultra_fast_sigmoid
  • ScaledTanH
  • leaky_rectify
  • very_leaky_rectify
  • elu
  • SELU
  • linear
  • identity
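
These inherited activations are plain functions applied to Theano expressions. A minimal sketch, assuming the module is importable as `dandelion.activation` (the import path is an assumption; the activation names come from the list above):

```python
import numpy as np
import theano
import theano.tensor as T
from dandelion.activation import relu, tanh  # import path is an assumption

x = T.matrix('x')
h = tanh(relu(x))            # activations compose like ordinary Theano ops
f = theano.function([x], h)
print(f(np.array([[-1.0, 2.0]], dtype=theano.config.floatX)))
```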

softmax

Apply softmax along the last dimension of input x.

softmax(x)
  • x: theano tensor of any shape
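
A minimal usage sketch on a 3-D tensor (the `dandelion.activation` import path is an assumption). Plain theano.tensor.nnet.softmax accepts only 2-D input, which is presumably why Dandelion provides its own any-shape version:

```python
import numpy as np
import theano
import theano.tensor as T
from dandelion.activation import softmax  # import path is an assumption

x = T.tensor3('x')           # e.g. (batch, time, num_classes) scores
y = softmax(x)               # normalizes over the last dimension
f = theano.function([x], y)

scores = np.random.randn(2, 4, 5).astype(theano.config.floatX)
probs = f(scores)
assert np.allclose(probs.sum(axis=-1), 1.0)  # each last-axis slice sums to 1
```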

log_softmax

Apply softmax along the last dimension of input x, computed in the log domain. Working in the log domain avoids the underflow of evaluating T.log(softmax(x)) directly.

log_softmax(x)
  • x: theano tensor of any shape
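
A sketch contrasting log_softmax with the naive composition (import path again an assumption; the printed values assume the log-domain computation is numerically stable, which is the usual reason such a function is provided):

```python
import numpy as np
import theano
import theano.tensor as T
from dandelion.activation import softmax, log_softmax  # import path is an assumption

x = T.matrix('x')
f_stable = theano.function([x], log_softmax(x))
f_naive = theano.function([x], T.log(softmax(x)))

scores = np.array([[1000.0, 0.0, -1000.0]], dtype=theano.config.floatX)
print(f_stable(scores))  # expected to stay finite, roughly [0, -1000, -2000]
print(f_naive(scores))   # small entries underflow to -inf after the log
```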