AI Foundations
Start here. Build a mental model of how LLMs work, from neural networks to attention.
- Step 1
What Is a Large Language Model?
Understand what large language models are, how they predict the next token, and why scale matters.
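The next-token idea can be sketched with a toy lookup table standing in for a real model (the `toy_lm` contexts and probabilities here are invented for illustration; a real LLM computes this distribution with a neural network over a vocabulary of tens of thousands of tokens):

```python
# Toy illustration, not a real model: a language model maps a context
# to a probability distribution over possible next tokens, then a
# token is chosen from that distribution.
toy_lm = {
    ("the", "cat"): {"sat": 0.6, "ran": 0.3, "is": 0.1},
    ("cat", "sat"): {"on": 0.8, "down": 0.2},
}

def predict_next(context):
    """Return the most probable next token for a two-token context."""
    dist = toy_lm[tuple(context[-2:])]
    return max(dist, key=dist.get)

tokens = ["the", "cat"]
for _ in range(2):                     # generate two tokens greedily
    tokens.append(predict_next(tokens))
print(tokens)  # ['the', 'cat', 'sat', 'on']
```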
beginner · 7 min read

- Step 2
Neural Networks Basics
Learn how neural networks learn patterns through layers, weights, and backpropagation.
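A single weight-and-bias unit trained by gradient descent gives the flavor of how weights are adjusted. This pure-Python sketch fits y = 2x + 1 and hand-codes the gradients that backpropagation would compute automatically through the layers of a full network:

```python
# Minimal sketch: one "neuron" w*x + b trained by gradient descent to
# fit y = 2x + 1. Real networks stack many such units into layers and
# use backpropagation to compute these gradients automatically.
data = [(x, 2 * x + 1) for x in range(-3, 4)]
w, b, lr = 0.0, 0.0, 0.05

for _ in range(500):                 # 500 passes over the data
    for x, y in data:
        pred = w * x + b
        err = pred - y
        w -= lr * 2 * err * x        # dLoss/dw for squared error
        b -= lr * 2 * err            # dLoss/db for squared error
print(round(w, 2), round(b, 2))      # converges near 2.0 and 1.0
```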
beginner · 8 min read

- Step 3
Tokenization
Interactive · Learn how text is split into tokens, why subword tokenizers exist, and how tokenization affects LLM behavior and cost.
beginner · 8 min read

- Step 4
Embeddings & Semantic Search
Interactive · Learn how embeddings turn text into vectors and enable semantic search by finding meaning-based similarity instead of keyword matches.
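Cosine similarity over toy vectors illustrates the core of semantic search (these 3-dimensional "embeddings" are hand-made for the example; real ones are produced by a trained model and have hundreds of dimensions):

```python
import math

# Hand-made 3-d "embeddings": semantically close words get nearby
# vectors, so ranking by cosine similarity surfaces related meaning.
embeddings = {
    "dog":   [0.9, 0.8, 0.1],
    "puppy": [0.8, 0.9, 0.2],
    "car":   [0.1, 0.2, 0.9],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

query = embeddings["dog"]
ranked = sorted(embeddings, key=lambda w: cosine(query, embeddings[w]),
                reverse=True)
print(ranked)  # 'puppy' ranks above 'car' for the query 'dog'
```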
intermediate · 9 min read

- Step 5
Transformer Architecture
Understand how Transformers use attention to process sequences in parallel and power modern LLMs.
intermediate · 10 min read

- Step 6
How Attention Mechanisms Work
Interactive · Learn how attention helps models decide what matters, from query-key-value math to multi-head behavior in modern transformers.
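Scaled dot-product attention for a single query can be written out in a few lines of plain Python (toy sizes and values, not a real model's learned weights; multi-head attention runs several of these in parallel):

```python
import math

def softmax(xs):
    m = max(xs)                        # subtract max for stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query over a few positions."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)          # how much each position "matters"
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

q = [1.0, 0.0]
keys = [[1.0, 0.0], [0.0, 1.0]]        # first key aligns with the query
values = [[10.0, 0.0], [0.0, 10.0]]
print(attention(q, keys, values))      # output leans toward the first value
```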
intermediate · 12 min read

- Step 7
Decoding & Sampling
Understand how token selection strategies control output quality, diversity, and consistency.
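Temperature scaling, one common sampling control, can be sketched as follows (the logits here are invented; real decoders combine temperature with strategies such as greedy decoding, top-k, or top-p):

```python
import math

def temperature_softmax(logits, temperature):
    """Turn logits into next-token probabilities. Lower temperature
    sharpens the distribution (more deterministic); higher flattens
    it (more diverse)."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]                 # toy scores for three tokens
cold = temperature_softmax(logits, 0.5)  # near-greedy
hot = temperature_softmax(logits, 2.0)   # more diverse
print(round(cold[0], 2), round(hot[0], 2))
```

The top token captures much more of the probability mass at low temperature, which is why low temperatures give consistent output and high temperatures give varied output.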
intermediate · 9 min read