Activation Functions
Compare neural network activation functions side by side. Visualize curves, derivatives, and key properties.
Properties
ReLU
- Range: [0, +∞)
- Monotonic: yes
- Zero-centered: no
- Differentiable: everywhere except x = 0
- Dead neurons: yes (the gradient is exactly 0 for all negative inputs)
- Cost: low

Default for hidden layers in most networks.
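For reference, here is a minimal TypeScript sketch of ReLU and the derivative the tool plots. Returning 0 at x = 0 is an assumption, not something the tool specifies; frameworks differ on that convention.

```ts
// ReLU: max(0, x). Piecewise linear, monotonic, and cheap to evaluate.
const relu = (x: number): number => Math.max(0, x);

// The true derivative is undefined at x = 0; returning 0 there is a
// common convention (some libraries pick 1 instead). For x < 0 the
// gradient is exactly 0, which is what makes "dead" neurons possible.
const reluPrime = (x: number): number => (x > 0 ? 1 : 0);
```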
GELU
- Range: [≈ −0.17, +∞)
- Monotonic: no
- Zero-centered: no
- Differentiable: everywhere
- Dead neurons: no
- Cost: medium (requires tanh or erf)

Default in Transformers (BERT, GPT).
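A sketch of GELU using the tanh approximation popularized by BERT/GPT-style implementations. The exact form is x · Φ(x) with the Gaussian CDF, but erf is not in the standard JavaScript Math object, so the approximation is the practical in-browser choice. The derivative follows from the product and chain rules.

```ts
// GELU (tanh approximation):
//   gelu(x) ≈ 0.5 * x * (1 + tanh(√(2/π) * (x + 0.044715 * x³)))
const K = Math.sqrt(2 / Math.PI);

const gelu = (x: number): number => {
  const u = K * (x + 0.044715 * x ** 3);
  return 0.5 * x * (1 + Math.tanh(u));
};

// d/dx [0.5 * x * (1 + tanh(u))], using sech²(u) = 1 − tanh²(u).
const geluPrime = (x: number): number => {
  const u = K * (x + 0.044715 * x ** 3);
  const t = Math.tanh(u);
  const du = K * (1 + 3 * 0.044715 * x ** 2);
  return 0.5 * (1 + t) + 0.5 * x * (1 - t * t) * du;
};
```

The small dip below zero (minimum ≈ −0.17) is visible in the plotted curve and is why GELU is non-monotonic.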
Toggle functions on or off and switch to the derivative view; all computation runs entirely in your browser.
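In-browser plotting can be as simple as sampling each function on an even grid and swapping which function is sampled when the derivative toggle flips. The `sampleCurve` helper and `showDerivative` flag below are hypothetical names for illustration, not the tool's actual code.

```ts
type Point = { x: number; y: number };

// Evaluate f on an even grid over [xMin, xMax] so the result can be
// drawn as an SVG path or onto a <canvas>.
function sampleCurve(
  f: (x: number) => number,
  xMin = -4,
  xMax = 4,
  steps = 200,
): Point[] {
  const pts: Point[] = [];
  for (let i = 0; i <= steps; i++) {
    const x = xMin + ((xMax - xMin) * i) / steps;
    pts.push({ x, y: f(x) });
  }
  return pts;
}

// The derivative view just samples the derivative instead of the function:
const relu = (x: number): number => Math.max(0, x);
const reluPrime = (x: number): number => (x > 0 ? 1 : 0);
const showDerivative = true; // e.g. bound to the UI toggle
const curve = sampleCurve(showDerivative ? reluPrime : relu);
```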