Module 3 of 6
Activation Functions
Implement the nonlinearities that make neural networks work.
Free · ~18 min · Beginner
What you'll learn
- ✓ ReLU and its variants (Leaky ReLU, GELU)
- ✓ Sigmoid and tanh
- ✓ Softmax with numerical stability
Prerequisites
Drills
1
ReLU & Variants
Implement relu, leaky_relu, and gelu — the most common activation functions in modern networks.
Easy · 🕐 6m · ⚡ 10 pts
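A minimal NumPy sketch of what this drill asks for. The GELU here uses the common tanh approximation rather than the exact Gaussian-CDF form; the drill's expected signatures may differ.

```python
import numpy as np

def relu(x):
    # max(0, x), applied elementwise
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # pass x through when positive, scale by alpha when negative
    return np.where(x > 0, x, alpha * x)

def gelu(x):
    # tanh approximation of GELU: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 x^3)))
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))
```

Unlike ReLU, Leaky ReLU keeps a small gradient for negative inputs, and GELU is smooth everywhere.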
2
Sigmoid & Tanh
Implement sigmoid and tanh from scratch and verify the identity tanh(x) = 2*sigmoid(2x) - 1.
Easy · 🕐 6m · ⚡ 10 pts
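A sketch of the two functions and the identity check, assuming plain NumPy (the drill's exact harness may differ):

```python
import numpy as np

def sigmoid(x):
    # logistic function: 1 / (1 + e^{-x})
    return 1 / (1 + np.exp(-x))

def tanh(x):
    # (e^x - e^{-x}) / (e^x + e^{-x})
    ex, enx = np.exp(x), np.exp(-x)
    return (ex - enx) / (ex + enx)

# verify the identity tanh(x) = 2*sigmoid(2x) - 1 on a grid
x = np.linspace(-5, 5, 101)
assert np.allclose(tanh(x), 2 * sigmoid(2 * x) - 1)
```

The identity follows by multiplying sigmoid's numerator and denominator by e^x; it also shows tanh is just a rescaled sigmoid with output range (-1, 1) instead of (0, 1).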
3
Softmax
Implement a numerically stable softmax for 1D and 2D arrays — the output layer of nearly every classifier.
Medium · 🕐 8m · ⚡ 15 pts
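The standard stability trick is to subtract the row-wise maximum before exponentiating; softmax is shift-invariant, so the result is unchanged but `np.exp` can no longer overflow. A sketch handling both 1D and 2D inputs via an `axis` argument (the drill's required signature is an assumption here):

```python
import numpy as np

def softmax(x, axis=-1):
    # subtract the max along the axis for numerical stability;
    # softmax(x) == softmax(x - c) for any constant c
    shifted = x - np.max(x, axis=axis, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=axis, keepdims=True)
```

With the naive formula, an input like `[1000.0, 1001.0]` overflows to `inf`; the shifted version exponentiates `[-1.0, 0.0]` instead and returns finite probabilities that sum to 1.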