Module 2 of 6
Vectors & Matrix Operations
Understand the math that powers attention, embeddings, and neural networks.
Free · ~25 min · Beginner
What you'll learn
- Dot product and cosine similarity
- Matrix multiplication from scratch
- Transpose and QKᵀ computation
Prerequisites
Drills
1
Dot Product & Cosine Similarity
Implement dot product from scratch (no np.dot) and cosine similarity, used everywhere in embeddings.
Easy · 8 min · ⚡ 10 pts
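A minimal pure-Python sketch of what this drill asks for; the drill's own function names and signatures may differ:

```python
import math

def dot(u, v):
    # Sum of elementwise products -- no np.dot allowed.
    assert len(u) == len(v), "vectors must have the same length"
    return sum(a * b for a, b in zip(u, v))

def cosine_similarity(u, v):
    # cos(theta) = (u . v) / (|u| * |v|)
    return dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))

print(dot([1, 2, 3], [4, 5, 6]))          # 32
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0 (orthogonal vectors)
```

Cosine similarity is just the dot product of the two vectors after normalizing each to unit length, which is why it reuses `dot` three times.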
2
Matrix Multiplication
Implement matmul using loops, then apply a weight matrix to a batch of vectors, the core operation of neural networks.
Medium · 10 min · ⚡ 15 pts
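A loop-based sketch of the kind of solution this drill expects (names like `matmul`, `X`, and `W` are illustrative, not the drill's required API):

```python
def matmul(A, B):
    # (n x k) @ (k x m) -> (n x m), via the classic triple loop.
    n, k, m = len(A), len(B), len(B[0])
    assert len(A[0]) == k, "inner dimensions must match"
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            for p in range(k):
                C[i][j] += A[i][p] * B[p][j]
    return C

# A batch of 2 input vectors (dim 2), pushed through a 2x3 weight matrix:
# each row of the result is one transformed vector, exactly a linear layer.
X = [[1.0, 2.0], [3.0, 4.0]]
W = [[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]
print(matmul(X, W))  # [[1.0, 2.0, 3.0], [3.0, 4.0, 7.0]]
```

Stacking the batch as rows of `X` is what lets one matmul transform every vector at once; this is why neural-network layers are matrix multiplications rather than per-vector loops.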
3
Transpose & QKᵀ
Implement transpose for 2D and batched 3D tensors, then compute QKᵀ, the exact operation in attention.
Medium · 8 min · ⚡ 15 pts
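A small sketch of the drill's three pieces, again in pure Python with illustrative names (a batched 3D "tensor" is modeled as a list of 2D matrices):

```python
def transpose(M):
    # 2D transpose: rows become columns.
    return [list(col) for col in zip(*M)]

def batched_transpose(T):
    # For a (batch, n, d) tensor, transpose the last two dims of each matrix.
    return [transpose(M) for M in T]

def matmul(A, B):
    # Compact loop-free-looking matmul; zip(*B) iterates columns of B.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

Q = [[1.0, 0.0], [0.0, 1.0]]   # 2 query vectors, d = 2
K = [[1.0, 2.0], [3.0, 4.0]]   # 2 key vectors,   d = 2
scores = matmul(Q, transpose(K))  # QK^T: (queries x keys) similarity scores
print(scores)  # [[1.0, 3.0], [2.0, 4.0]]
```

Entry `scores[i][j]` is the dot product of query `i` with key `j`, which is why attention needs the transpose: it lines up the key vectors as columns so one matmul computes every query-key similarity.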