Welcome to Module 5: Language & Transformers
You've mastered how AI sees images. Now let's explore how AI understands and generates language: the technology behind ChatGPT, translation, and voice assistants.
- Why language is hard for computers
- How words become numbers (embeddings)
- The attention mechanism
- How transformers power modern AI
The Challenge of Understanding Language
Language seems easy to us, but it's incredibly complex for computers. Let's explore why:
Click each challenge to learn more
Words as Numbers: Embeddings
Computers can't understand words directly, so they need numbers. Embeddings convert each word into a vector (a list of numbers) where similar words end up with similar vectors.
Click each word to explore embeddings
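The idea of "similar words have similar vectors" can be sketched in a few lines. The tiny 3-dimensional vectors below are made up for illustration (real embeddings have hundreds of dimensions and are learned from data), but the comparison works the same way: related words point in similar directions, which we measure with cosine similarity.

```python
import math

# Toy 3-dimensional "embeddings" (hypothetical values for illustration;
# real models learn vectors with hundreds of dimensions).
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(a, b):
    """How closely two vectors point in the same direction (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Related words score much higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # close to 1
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # much lower
```

This is exactly how a model "knows" that *king* and *queen* are related while *king* and *apple* are not: nothing about the spelling matters, only the geometry of the vectors.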
Attention: What Should I Focus On?
The attention mechanism is the breakthrough that powers transformers. It lets the model weigh how relevant each word is to every other word in the sentence, instead of treating them all equally.
Click a word to see what it "attends to":
Explore at least 3 words
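Under the hood, "attending to" a word is just a weighted average. The sketch below is a minimal single-query version of scaled dot-product attention, using tiny hand-picked vectors for illustration: the query is compared against every key, the scores become weights via softmax, and the output mixes the value vectors by those weights.

```python
import math

def softmax(scores):
    """Turn raw scores into positive weights that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector.

    The query is compared against every key; the resulting weights say
    how much this word should 'attend to' each other word, and the
    output is the weighted mix of the value vectors.
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    output = [sum(w * v[i] for w, v in zip(weights, values))
              for i in range(len(values[0]))]
    return output, weights

# A query that matches the first key gets most of the attention weight.
out, weights = attention([1.0, 0.0],
                         keys=[[1.0, 0.0], [0.0, 1.0]],
                         values=[[1.0, 0.0], [0.0, 1.0]])
print(weights)  # first weight is larger than the second
```

In a real transformer, the queries, keys, and values are produced from the word embeddings by learned matrices, and many attention "heads" run in parallel; the core computation is still this one.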
Think Like an LLM: Token Prediction
Large Language Models generate text by predicting one token at a time. Can you predict like an AI?
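"One token at a time" can be shown with a toy next-token table. The probabilities below are invented for illustration (a real LLM computes them with a neural network over tens of thousands of tokens), but the generation loop is the same shape: look at the text so far, pick a next token, append it, repeat.

```python
# Toy next-token probabilities (made-up values for illustration).
next_token_probs = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "sat": {"down": 0.9, "<end>": 0.1},
}

def generate(prompt, max_tokens=5):
    """Greedy decoding: always pick the single most likely next token."""
    tokens = prompt.split()
    for _ in range(max_tokens):
        options = next_token_probs.get(tokens[-1])
        if options is None:
            break  # no prediction available for this token
        best = max(options, key=options.get)
        if best == "<end>":
            break
        tokens.append(best)
    return " ".join(tokens)

print(generate("the"))  # → "the cat sat down"
```

Always taking the top choice is called greedy decoding; chatbots usually *sample* from the probabilities instead, which is why the same prompt can produce different answers.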
The Transformer Architecture
Transformers revolutionized AI by processing all the words in a sequence in parallel using attention. The original transformer (2017) had four key components, though modern models don't always use all of them:
Click each component to learn more
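To see how the pieces fit together, here is a hypothetical skeleton of one transformer layer. The `attend` and `feed_forward` functions stand in for the real learned sub-layers (their internal math is omitted), so this only shows the order of operations: attention mixes information across positions, a residual connection keeps the original signal, and a feed-forward network then transforms each position independently.

```python
# Hypothetical skeleton of one transformer layer (the learned math inside
# each sub-layer is omitted; real layers also apply normalization).
def transformer_layer(token_vectors, attend, feed_forward):
    # 1. Attention: each position mixes in information from every other one.
    attended = attend(token_vectors)
    # 2. Residual connection: add the update onto the original vectors.
    mixed = [[a + t for a, t in zip(att_vec, tok_vec)]
             for att_vec, tok_vec in zip(attended, token_vectors)]
    # 3. Feed-forward network, applied to each position independently.
    return [feed_forward(vec) for vec in mixed]
```

Real models stack dozens of these layers, and because each layer treats every position at once, the whole sequence can be processed in parallel on a GPU rather than word by word.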
Transformers Power Modern AI
The transformer architecture is behind virtually all modern language AI:
Click each application to learn more
Language AI Vocabulary
Match each term with its definition:
Knowledge Check
Test your understanding of language AI and transformers:
Gold Achievement!
Language & Transformers Certificate
Student Name
Has demonstrated understanding of natural language processing,
word embeddings, attention mechanisms, and transformer architecture.
One more module to complete your AI journey!
Module 6: AI in Practice
Apply your knowledge with ethics, prompting, and careers!