Statements (20)
Predicate | Object |
---|---|
gptkbp:instanceOf | activation function |
gptkbp:advantage | computationally efficient, helps mitigate vanishing gradient problem |
gptkbp:disadvantage | can cause dying ReLU problem |
gptkbp:defaultActivation | many convolutional neural networks |
gptkbp:differentiable | almost everywhere |
gptkbp:form | f(x) = max(0, x) |
gptkbp:fullName | gptkb:Rectified_Linear_Unit |
https://www.w3.org/2000/01/rdf-schema#label | ReLU |
gptkbp:introducedIn | 1980s |
gptkbp:nonlinearity | yes |
gptkbp:range | [0, ∞) |
gptkbp:usedIn | gptkb:artificial_neural_networks, deep learning |
gptkbp:variant | gptkb:Exponential_Linear_Unit, gptkb:Leaky_ReLU, gptkb:Parametric_ReLU |
gptkbp:bfsParent | gptkb:model, gptkb:convolutional_neural_network |
gptkbp:bfsLayer | 5 |
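
The statements above fully determine the function, so a minimal sketch may help. The following Python snippet (assuming NumPy, which is not part of the entry) implements f(x) = max(0, x) together with the three listed variants; the alpha defaults (0.01 for Leaky ReLU, 1.0 for ELU) are common conventions assumed here, not facts from this knowledge-base entry.

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x): zero for negative inputs, identity otherwise.
    # Range is [0, inf); differentiable everywhere except at x = 0.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU keeps a small slope alpha for x < 0, which helps
    # avoid the dying ReLU problem (neurons stuck outputting zero).
    return np.where(x > 0, x, alpha * x)

def parametric_relu(x, alpha):
    # Parametric ReLU: same form as Leaky ReLU, but alpha is a
    # learned parameter rather than a fixed constant.
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Exponential Linear Unit: smooth for x < 0, saturating at -alpha.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))        # [0.   0.    0.  0.5  2. ]
print(leaky_relu(x))  # [-0.02  -0.005  0.  0.5  2. ]
print(elu(x))         # [-0.865 -0.393  0.  0.5  2. ] (rounded)
```

As the gptkbp:differentiable statement notes, ReLU is differentiable almost everywhere: the derivative is 0 for x < 0 and 1 for x > 0, and at the kink x = 0 frameworks conventionally assign a fixed subgradient (typically 0).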