AI Foundations

Start here. Build a mental model of how LLMs work, from neural networks to attention.

7 concepts · ~63 min total · 3 interactive
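
Short code sketches illustrating the core idea of each step follow the list.
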
  1. What Is a Large Language Model?

    Understand what large language models are, how they predict the next token, and why scale matters.

    Beginner · 7 min read
  2. Neural Networks Basics

    Learn how neural networks learn patterns through layers, weights, and backpropagation.

    Beginner · 8 min read
  3. Tokenization

    Learn how text is split into tokens, why subword tokenizers exist, and how tokenization affects LLM behavior and cost.

    Beginner · 8 min read · Interactive
  4. Embeddings & Semantic Search

    Learn how embeddings turn text into vectors and enable semantic search by finding meaning-based similarity instead of keyword matches.

    Intermediate · 9 min read · Interactive
  5. Transformer Architecture

    Understand how Transformers use attention to process sequences in parallel and power modern LLMs.

    Intermediate · 10 min read
  6. How Attention Mechanisms Work

    Learn how attention helps models decide what matters, from query-key-value math to multi-head behavior in modern transformers.

    Intermediate · 12 min read · Interactive
  7. Decoding & Sampling

    Understand how token selection strategies control output quality, diversity, and consistency.

    Intermediate · 9 min read
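
Code sketches

Step 1 (What Is a Large Language Model?): a minimal sketch of next-token prediction. The prompt and the probability table are invented for illustration; a real model assigns a probability to every token in its vocabulary and repeats this loop one token at a time.

```python
# Toy next-token prediction. The probabilities below are made up;
# a real LLM scores tens of thousands of candidate tokens.
context = "The capital of France is"

next_token_probs = {
    " Paris": 0.92,
    " a": 0.03,
    " located": 0.02,
    " the": 0.02,
    " Lyon": 0.01,
}

# Greedy decoding: take the most probable token and append it.
next_token = max(next_token_probs, key=next_token_probs.get)
print(context + next_token)  # "The capital of France is Paris"
```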
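
Step 2 (Neural Networks Basics): a single weight and bias fitted by gradient descent. The data points, learning rate, and epoch count are arbitrary choices for the sketch; real networks stack many such units into layers and use backpropagation to compute the same kind of gradients through all of them.

```python
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]  # points on y = 2x + 1
w, b = 0.0, 0.0   # parameters start at arbitrary values
lr = 0.05         # learning rate

for _ in range(500):
    grad_w = grad_b = 0.0
    for x, y in data:
        pred = w * x + b          # forward pass
        error = pred - y          # prediction error
        grad_w += 2 * error * x   # d(squared error)/dw
        grad_b += 2 * error       # d(squared error)/db
    w -= lr * grad_w / len(data)  # step against the gradient
    b -= lr * grad_b / len(data)

print(round(w, 2), round(b, 2))   # approaches 2.0 and 1.0
```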
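
Step 3 (Tokenization): a toy byte-pair-encoding loop that starts from characters and repeatedly merges the most frequent adjacent pair. Real subword tokenizers learn their merge tables from large corpora; four merges over one short string are enough to show why frequent fragments become single tokens while rare endings stay split.

```python
from collections import Counter

def most_frequent_pair(tokens):
    pairs = Counter(zip(tokens, tokens[1:]))
    return pairs.most_common(1)[0][0] if pairs else None

def merge(tokens, pair):
    out, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            out.append(tokens[i] + tokens[i + 1])  # fuse the pair into one token
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out

tokens = list("low lower lowest")   # start from single characters
for _ in range(4):                  # learn four merges
    pair = most_frequent_pair(tokens)
    if pair is None:
        break
    tokens = merge(tokens, pair)

print(tokens)  # common fragments end up as single tokens; rare suffixes stay split
```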
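
Step 4 (Embeddings & Semantic Search): cosine similarity over hand-written 3-dimensional vectors standing in for real embeddings, which typically have hundreds or thousands of dimensions. The search step is the same either way: rank documents by how close their vectors are to the query vector, not by shared keywords.

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Pretend embeddings: the numbers are invented for illustration.
documents = {
    "How do I reset my password?":     [0.9, 0.1, 0.0],
    "Steps to recover account access": [0.8, 0.2, 0.1],
    "Best hiking trails near Denver":  [0.0, 0.1, 0.9],
}
query_vec = [0.85, 0.15, 0.05]  # stand-in embedding of "forgot my login"

ranked = sorted(documents, key=lambda d: cosine(query_vec, documents[d]), reverse=True)
print(ranked)  # the account-recovery documents rank above the hiking one
```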
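
Step 5 (Transformer Architecture): a structural sketch only. A Transformer is a stack of identical blocks, and every position in the sequence goes through each block at once rather than left to right. The attention and feed-forward pieces below are deliberately trivial stand-ins, and layer normalization is omitted, so the block-and-residual shape is easy to see; the step 6 sketch fills in the real attention math.

```python
def mix_positions(xs):
    # stand-in for attention: every position receives the sequence average
    avg = [sum(col) / len(xs) for col in zip(*xs)]
    return [avg[:] for _ in xs]

def feed_forward(x):
    # stand-in for the per-position MLP
    return [2 * v for v in x]

def transformer_block(xs):
    mixed = mix_positions(xs)                                        # positions exchange information
    xs = [[a + b for a, b in zip(x, m)] for x, m in zip(xs, mixed)]  # residual connection
    xs = [[a + b for a, b in zip(x, feed_forward(x))] for x in xs]   # residual around the MLP
    return xs

sequence = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # three toy token vectors
for _ in range(2):                               # a two-block "model"
    sequence = transformer_block(sequence)
print(sequence)
```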
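
Step 6 (How Attention Mechanisms Work): scaled dot-product attention on a 2-dimensional toy example. In a real model the queries, keys, and values come from learned projections of the token vectors and there are many heads; here they are written out directly so the score, softmax, and weighted-sum pattern is visible.

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    d = len(keys[0])
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
        weights = softmax(scores)   # how much each position matters to this query
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs

Q = [[1.0, 0.0]]                 # one query
K = [[1.0, 0.0], [0.0, 1.0]]     # keys for two positions
V = [[10.0, 0.0], [0.0, 10.0]]   # values for two positions
print(attention(Q, K, V))        # weighted toward the first value (~[6.7, 3.3])
```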
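
Step 7 (Decoding & Sampling): temperature sampling over an invented four-token distribution. Lower temperature sharpens the distribution and gives more consistent output; higher temperature flattens it and gives more diverse output. Top-k and top-p are additional filters applied before sampling and are not shown here.

```python
import math
import random

logits = {" Paris": 5.0, " Lyon": 2.0, " located": 1.5, " nice": 1.0}  # made-up scores

def sample(logits, temperature):
    scaled = {t: v / temperature for t, v in logits.items()}
    m = max(scaled.values())
    exps = {t: math.exp(v - m) for t, v in scaled.items()}
    total = sum(exps.values())
    probs = {t: e / total for t, e in exps.items()}        # softmax with temperature
    return random.choices(list(probs), weights=list(probs.values()))[0]

random.seed(0)
for temperature in (0.2, 1.0, 2.0):
    draws = [sample(logits, temperature) for _ in range(10)]
    print(temperature, draws)
# At 0.2 nearly every draw is " Paris"; as temperature rises, other tokens appear more often.
```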