Beginner Course · Part of AI Builder

AI Builder: AI Math & Vector Dynamics

Master the mathematics that powers every AI system. Over four weeks, go from linear algebra and probability through transformer mechanics and distance metrics — building a semantic search engine from scratch as your capstone.

4 weeks

What You'll Learn

Represent and manipulate vectors and matrices using NumPy
Implement Softmax, temperature scaling, and nucleus sampling from scratch
Simulate gradient descent and understand how transformers learn
Calculate Query, Key, and Value matrices and multi-head attention
Build a semantic search engine using cosine similarity and KNN

Course Content

W1
Week 1: Linear Algebra for Embeddings
Understand the mathematical objects embeddings actually are.
1
Vectors, Matrices & NumPy
Representing data as vectors and matrices in NumPy — the foundational objects behind every embedding and weight tensor.
2
Matrix Operations
Dot products, matrix multiplications, and transposes — the core computations inside attention layers and dense networks.
3
Solving Systems of Equations
Using matrix inversion and decomposition to solve linear systems, the mathematical basis for model parameter estimation.
4
High-Dimensional Spaces & Determinants
Intuitions for why embeddings live in 768- or 1536-dimensional space and how determinants measure transformation scale.
Weekly Win
Vector Operations in Python
Implement a suite of vector operations in pure Python and NumPy, then verify each result matches the mathematical definition.
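As a taste of Week 1's material, here is a minimal NumPy sketch of the core operations covered: dot products, matrix multiplication, solving a linear system, and the determinant. This is an illustration only, not the course's reference implementation.

```python
import numpy as np

# Two vectors as 1-D NumPy arrays
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Dot product: sum of elementwise products
assert np.isclose(a @ b, sum(x * y for x, y in zip(a, b)))

# Matrix multiplication and transpose
M = np.array([[1.0, 2.0], [3.0, 4.0]])
print(M @ M.T)

# Solving Mx = v with np.linalg.solve (preferred over explicit inversion)
v = np.array([5.0, 6.0])
x = np.linalg.solve(M, v)
assert np.allclose(M @ x, v)

# Determinant: how much M scales area/volume under the transformation
print(np.linalg.det(M))  # ≈ -2.0
```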
W2
Week 2: Probability & Token Sampling
Understand exactly how an LLM decides its next token.
1
Probability Distributions & Standard Deviation
How probability distributions describe uncertainty and why standard deviation matters for understanding model confidence.
2
Softmax & Logit Normalization
Converting raw logit scores into normalized probabilities using Softmax — the final step of every language model forward pass.
3
Temperature Scaling
Tuning the temperature parameter to control the trade-off between deterministic and creative token sampling.
4
Top-K & Top-P Nucleus Sampling
Restricting the token sampling pool using Top-K cutoffs and Top-P nucleus thresholds to balance quality and diversity.
Weekly Win
Advanced Sampling Techniques
Implement Min-p sampling and repetition penalty from scratch, comparing output diversity against standard Top-P sampling.
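A flavor of Week 2's sampling math: a hedged NumPy sketch of Softmax with temperature scaling and a Top-P (nucleus) filter. Function names here are illustrative, not the course's own code.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Divide by temperature first: T < 1 sharpens, T > 1 flattens the distribution
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()                 # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def top_p_filter(probs, p=0.9):
    # Keep the smallest set of tokens whose cumulative probability reaches p,
    # then renormalize so the kept probabilities sum to 1
    order = np.argsort(probs)[::-1]
    cum = np.cumsum(probs[order])
    cutoff = np.searchsorted(cum, p) + 1
    kept = np.zeros_like(probs)
    kept[order[:cutoff]] = probs[order[:cutoff]]
    return kept / kept.sum()

logits = np.array([2.0, 1.0, 0.1, -1.0])
print(softmax(logits))                   # peaked distribution
print(softmax(logits, temperature=2.0))  # flatter distribution
print(top_p_filter(softmax(logits), p=0.8))
```

Raising the temperature visibly flattens the distribution, and the Top-P filter zeroes out the low-probability tail before renormalizing.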
W3
Week 3: The Mechanics of Transformers
See inside the architecture that powers every modern LLM.
1
Calculus Fundamentals for Optimization
Derivatives, chain rule, and partial derivatives — the mathematical tools that make backpropagation work.
2
Simulating Gradient Descent
Iteratively minimizing a loss function by hand to build an intuitive understanding of how neural networks learn.
3
Additive, Multiplicative & Self-Attention
Three attention mechanisms compared — understanding why scaled dot-product self-attention became the dominant paradigm.
4
Query, Key & Value Matrices
The mathematical derivation of Q, K, V matrices and how they implement a differentiable soft lookup table.
Weekly Win
Multi-Head Attention & Positional Encoding
Implement multi-head attention and sinusoidal positional encoding in NumPy, producing the correct output dimensions for a transformer block.
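The heart of Week 3 is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal single-head NumPy sketch with random matrices (illustrative shapes only, not the course's solution):

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # pairwise query-key similarities
    weights = softmax(scores, axis=-1)  # each row is a probability distribution
    return weights @ V                  # weighted mixture of value vectors

rng = np.random.default_rng(0)
seq_len, d_k, d_v = 4, 8, 8
Q = rng.standard_normal((seq_len, d_k))
K = rng.standard_normal((seq_len, d_k))
V = rng.standard_normal((seq_len, d_v))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # → (4, 8)
```

Each output row is a convex combination of the value rows, which is what makes attention a differentiable soft lookup.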
W4
Week 4: Distance Metrics & Capstone
Measure semantic similarity — the engine behind every vector search.
1
Euclidean Distance in NumPy
Calculating L2 distance between vectors and understanding when geometric closeness maps to semantic similarity.
2
Cosine Similarity
Measuring the angle between vectors to capture semantic relevance independently of magnitude.
3
K-Nearest Neighbors
Building a KNN algorithm to retrieve the most semantically relevant results from a vector index.
4
Evaluating Similarity — Precision & Recall
Measuring retrieval quality using Precision and Recall to determine whether the right results are being returned.
Weekly Win
Capstone: Semantic Search Engine
Build a semantic search engine from scratch — embed a document corpus, index with KNN, and evaluate retrieval quality using Precision and Recall.
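The capstone's retrieval loop can be sketched in a few lines of NumPy: cosine similarity, a brute-force KNN lookup, and Precision/Recall on the retrieved set. A hedged illustration with toy 2-D "embeddings", not the course's solution code.

```python
import numpy as np

def cosine_similarity(a, b):
    # cos(theta) = (a · b) / (|a||b|) — magnitude-independent relevance
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def knn_search(query, index, k=3):
    # Rank every indexed vector by cosine similarity; return top-k indices
    sims = np.array([cosine_similarity(query, v) for v in index])
    return np.argsort(sims)[::-1][:k]

index = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [-1.0, 0.0]])
query = np.array([1.0, 0.05])
print(knn_search(query, index, k=2))  # → [0 1]

# Retrieval quality: Precision and Recall over document-id sets
retrieved, relevant = {0, 1, 3}, {0, 1, 2}
tp = len(retrieved & relevant)
precision = tp / len(retrieved)  # 2/3: how much of what we returned is relevant
recall = tp / len(relevant)      # 2/3: how much of the relevant set we found
```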

Prerequisites

High school mathematics
Basic Python knowledge
📚 Beginner Level
Course Price
₹9,999 · India
$199 · International
One-time payment
Next cohort starts Mar 30
Duration: 4 weeks
Level: Beginner
Format: Cohort-based
Modules: 4

What's included:

Live cohort sessions
Hands-on projects
Certificate of completion
Lifetime access
Career support

Part of Learning Track

🛠️ AI Builder
6 courses in track