Statements (23)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:activation_function |
| gptkbp:advantage | computationally efficient, helps mitigate vanishing gradient problem |
| gptkbp:disadvantage | can cause dying ReLU problem |
| gptkbp:defaultActivation | many convolutional neural networks |
| gptkbp:differentiable | almost everywhere |
| gptkbp:form | f(x) = max(0, x) |
| gptkbp:fullName | gptkb:Rectified_Linear_Unit |
| gptkbp:introducedIn | 1980s |
| gptkbp:nonlinearity | yes |
| gptkbp:range | [0, ∞) |
| gptkbp:usedIn | gptkb:artificial_neural_networks, deep learning |
| gptkbp:variant | gptkb:Exponential_Linear_Unit, gptkb:Leaky_ReLU, gptkb:Parametric_ReLU |
| gptkbp:bfsParent | gptkb:GCN, gptkb:U-Net, gptkb:feedforward_neural_network, gptkb:DCGAN, gptkb:GCN_architecture |
| gptkbp:bfsLayer | 6 |
| https://www.w3.org/2000/01/rdf-schema#label | ReLU |
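
The gptkbp:form and gptkbp:differentiable entries above can be illustrated concretely. Below is a minimal NumPy sketch (not part of the KB entry) of f(x) = max(0, x) and its derivative; assigning subgradient 0 at the kink x = 0 is a common convention, assumed here. The gradient being exactly 1 for positive inputs is why ReLU helps mitigate the vanishing gradient problem, while the gradient being exactly 0 for negative inputs is the source of the dying ReLU problem.

```python
import numpy as np

def relu(x):
    # f(x) = max(0, x), as given by gptkbp:form
    return np.maximum(0.0, x)

def relu_grad(x):
    # ReLU is differentiable almost everywhere; the kink at x = 0
    # is conventionally assigned a subgradient of 0 here (assumption).
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.] -> gradient 1 for x > 0 (no vanishing),
                     # 0 for x <= 0 (source of the dying ReLU problem)
```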
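
The variants listed under gptkbp:variant differ only in how they treat negative inputs. The sketch below shows one plausible implementation of each; the alpha values (0.01 for Leaky ReLU, 1.0 for ELU) are common defaults assumed for illustration, not values from this entry.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small fixed slope alpha for x < 0 keeps gradients nonzero
    return np.where(x > 0, x, alpha * x)

def parametric_relu(x, alpha):
    # Parametric ReLU (PReLU): same form, but alpha is a learned parameter
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Exponential Linear Unit: smooth negative branch saturating toward -alpha
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
```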