Elementary students learning with classroom technology

🧠 Practical AI literacy for K–6 classrooms.

Everything you need to teach students how AI thinks, not just how to use it. Kid-friendly definitions, classroom-tested assignments, a full framework aligned to how California districts are rolling out AI literacy, and honest answers to the questions teachers are actually asking.

This page is designed to be read straight through. Teachers should walk away with: (1) a shared language for talking about AI with students, (2) three assignments ready to use in class, and (3) a clear position on where AI fits in instruction and where it doesn’t.

🧩 The Framework: Three Types of AI, One Principle

AI isn’t one thing. Districts that roll out AI literacy well give students a taxonomy they can use: a way to categorize what they’re encountering. The clearest framing, and the one California districts are converging on, is a three-type breakdown.

🎨

Generative AI

Makes new things (text, images, audio, code) in response to a prompt.

Examples: ChatGPT, Google Gemini, Claude, image and music generators, Brisk for lesson plans.

🔮

Predictive AI

Spots patterns and guesses what comes next. The AI hiding in plain sight.

Examples: Spotify and YouTube recommendations, Google Maps traffic, social media feeds, iReady adaptive practice.

🤖

Agentic AI

Acts on its own to complete multi-step tasks. Emerging category.

Examples: Most K–6 students won’t meet true agentic AI this year, but they’ll hear the term and see more of it next year.

✋ Human-centered AI: The staff is the expert. AI is the tool.

Every AI-assisted decision in education starts with human expertise and ends with human judgment. The AI never makes the final call. This is often called “human in the loop” (HITL), a term worth teaching students directly.

This principle matters more than any specific tool. Tools change every six months. The principle (educators and students stay in charge of the learning; AI supports rather than replaces that work) is what makes AI literacy durable.

📚 Kid-Friendly Glossary

Language is half the battle. Students can’t think clearly about AI if they don’t have words for it. These ten terms cover what a K–6 student needs to navigate AI in their world. Each card has: a student-facing definition, a teacher note on common misconceptions, and a try-this classroom example.

💬

Chatbot

For kids: A computer program you can talk with in words. You type, it writes back. Examples: ChatGPT, Google Gemini.

Teacher: Students often think a chatbot “knows” things. It doesn’t; it predicts plausible words.

🎯 Try this: “Ask Gemini a question you know the answer to. Then ask one you don’t. How do you know which to trust?”

🎨

Generative AI

For kids: AI that makes new things (stories, pictures, songs, code) based on what you ask for.

Teacher: “New” is misleading. Generative AI recombines patterns from training data. Nothing it makes is truly original.

🎯 Try this: Students write a short story. AI writes one from the same prompt. Compare: what feels familiar? What does yours have that AI can’t?

🔮

Predictive AI

For kids: AI that looks at patterns and guesses what might happen or what you might like next.

Teacher: This is the AI hiding in plain sight. Kids use it more than generative AI but name it less.

🎯 Try this: “Open YouTube Kids. Why is that video on your home screen? What did YouTube learn about you to put it there?”

💭

Hallucination

For kids: When AI says something confidently but it isn’t true. AI doesn’t know it’s wrong. That’s why people always check.

Teacher: The single most important word on this list. Fluent wrongness is AI’s default failure mode.

🎯 Try this: Ask AI for “5 books about [a very specific niche].” Google each title. Count how many don’t exist. This lesson sticks for years.

📖

Large Language Model (LLM)

For kids: AI that has “read” billions of words. It uses what it learned to guess the next word when writing or answering.

Teacher: The word “guess” matters. LLMs are probability engines, not knowledge engines.

🎯 Try this: Give students “The cat sat on the ___.” What comes next? That’s all an LLM does, just at scale.
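For teachers comfortable with a little Python, the “guess the next word” idea can be demoed in a few lines. This is a sketch, not how real LLMs work (they use neural networks and vastly more context), and the tiny training text is invented for the demo:

```python
# Toy "guess the next word" demo. It counts which word follows each word
# in a tiny invented training text, then guesses the most common follower.
# Real LLMs use neural networks and far more context, but the core move
# (predicting a likely next word, not looking up a fact) is the same.
from collections import Counter, defaultdict

training_text = "the cat sat on the mat . the cat ate . the dog sat on the rug ."

follows = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def guess_next(word):
    """Return the word most often seen right after `word` in the training text."""
    return follows[word].most_common(1)[0][0]

print(guess_next("the"))   # cat  (it has seen "the cat" most often)
print(guess_next("sat"))   # on
```

Asking it about a word it has never seen would fail, which is itself a talking point: the model only “knows” its training data.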

🔍

Pattern

For kids: Something that repeats. AI’s main job is finding patterns (in pictures, words, or behavior) and using them to guess what comes next.

Teacher: Pattern recognition is the conceptual core of AI. Teaching math patterns (AB, ABB) is already AI prep.

🎯 Try this: Show 🐕 🐈 🐕 🐈 🐕 ___. What comes next? How do you know? That’s what AI does, with more data.
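The dog-cat exercise can also be run as a short Python demo for teachers who want one. A minimal sketch that assumes the sequence strictly repeats a fixed chunk (real pattern-finding AI handles far messier data):

```python
# Toy pattern guesser for the dog/cat exercise: find the shortest chunk
# that, repeated, reproduces the sequence so far, then predict the next
# item. Assumes a strictly repeating pattern, for illustration only.
def predict_next(seq):
    for size in range(1, len(seq) + 1):
        chunk = seq[:size]
        # Does repeating this chunk reproduce the whole sequence so far?
        if all(seq[i] == chunk[i % size] for i in range(len(seq))):
            return chunk[len(seq) % size]

print(predict_next(["dog", "cat", "dog", "cat", "dog"]))  # cat
print(predict_next(["A", "B", "B", "A", "B", "B", "A"]))  # B  (the ABB pattern)
```

The student move (spot the repeating unit, extend it) and the code’s move are the same, which is the point of the glossary entry.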

💡

Prompt

For kids: The question or instruction you give AI. A clear prompt usually gets a better answer.

Teacher: Prompt quality mirrors writing quality. Teaching clear prompting is teaching clear thinking.

🎯 Try this: Compare “write a story” vs. “write a 5-sentence story about a kid who loses their dog at the park.” Side by side.

📊

Training Data

For kids: The examples AI learned from. If AI only saw one type of dog, it won’t know other dogs well. Training data shapes what AI can do.

Teacher: This is where bias lives. AI doesn’t have opinions; it has patterns from whoever’s data it was trained on.

🎯 Try this: Train the Robot assignment (below). “If a robot only saw tiny white dogs, what mistake might it make with a Great Dane?”

⚙️

Algorithm

For kids: A set of rules a computer follows, step by step, to solve a problem or make a choice.

Teacher: Students hear this word daily (“the algorithm showed me…”). Naming it demystifies it.

🎯 Try this: Walk through making a PB&J as a numbered algorithm. Then ask: what does TikTok’s algorithm do? Same idea, just with many more steps.

🎭

Deepfake

For kids: A video, picture, or audio clip that AI made to look or sound like a real person, but isn’t really them.

Teacher: No longer advanced curriculum. Upper-elementary kids have seen deepfakes, often without knowing it.

🎯 Try this: Show age-appropriate examples. “What clues tell you it’s not real? What if there were no clues?”

๐Ÿง‘โ€๐Ÿซ Three Classroom-Tested Assignments

Each assignment is short (20โ€“30 minutes), requires only paper, and works whether or not your school has AI tools available. None require students to touch AI โ€” the thinking happens before the tool.

🤖
K–2

Train the Robot

Teaches: classification & training-data bias

Students look at training examples (birds, airplanes, butterflies) and answer: What pattern might the robot notice? What mistake might it make? The final question (“If a robot only saw tiny white dogs, what could it miss?”) lands training bias in 6-year-old language.

⏱️ 20 min · 📋 K-LS1-1, CCSS SL.K.1

🎯
3–5

How Does AI Choose?

Teaches: recommendation algorithms

Students work through four questions: “If you watch 3 cat videos, what shows next?” “AI looks for: patterns / nothing / magic?” “Can AI make mistakes?” “Where have I seen AI?” Then they draw their own imagined video feed. A perfect AI Literacy Day warm-up.

⏱️ 25 min · 📋 CCSS SL.3.1, SL.4.1

โœ๏ธ
3โ€“6

What Did the AI Do?

Teaches: authorship & critical review

After watching an AI-generated video of their group’s story, students answer: What did AI get right? What did it change? Who is the author โ€” the students, the AI, or both? That last question is where kids start arguing. That’s the point.

โฑ๏ธ 30 min  ยท  ๐Ÿ“‹ CCSS W.3.3, SL.3.3

PDFs and printable versions coming soon. The assignments can be recreated on any worksheet with standard elementary formatting.

๐Ÿซ Where AI Shows Up in a School Day

One of the most practical things a teacher can do is help students see AI where it already lives. Most kids think “AI” means “ChatGPT.” That’s wrong โ€” and the wrongness blocks learning.

Here’s a rough inventory of a typical elementary student’s day:

🌅
Morning. Autocomplete when they type. Google search suggestions. Voice assistants. All predictive AI.

📝
ELA block. Spelling and grammar in Google Docs. Read-aloud features. Any adaptive reading tool. Mostly predictive; grammar correction is light generative.

🔢
Math block. iReady personalization, adaptive practice platforms, ST Math’s problem selection. All predictive: matching the student’s pattern to similar students’ patterns.

🎮
Recess / social. YouTube Kids recommendations, TikTok feeds (older siblings’ phones), Roblox matchmaking. Predictive, dense, constant.

🔍
Research / inquiry. Search engine ranking (predictive). Any class Gemini or Brisk demo (generative). Image search (predictive).

🌙
After school. Streaming recommendations. Autocorrect on family phones. Navigation. Voice assistants. Game AI.

Pointing these out across a week transforms how students hear the word “AI.” It stops meaning “a robot that writes essays” and starts meaning “a kind of software that’s already in most of what I touch.” That shift is foundational.

๐Ÿ—‚๏ธ Aligning With Your District’s AI Rollout

Most California districts are in year one or two of formal AI literacy programming. The approach emerging across districts โ€” OMSD, San Bernardino County, LAUSD, and others โ€” shares a few core commitments:

✅

Approved tools only.

Student data stays out of public AI. FERPA-compliant tools only: Google Gemini for Education, Brisk, NotebookLM (staff accounts). Student OMSD accounts typically don’t have Gemini.

🔒

No PII in prompts.

Names, IDs, birthdays, and addresses never go in, even in approved tools. De-identified prompts only. This is the most teachable data-privacy principle for elementary staff.

📋

Academic integrity applies.

District policies (typically BP 5131.9 in CA) extend to AI-generated work. Originality reports in Google Classroom and NoRedInk help flag concerns.

🛠️

AI is a tool, not a curriculum.

Gemini doesn’t replace scope and sequence. Every activity starts with human expertise and ends with human review.

The resources on this page complement those commitments; they don’t replace them. If your district has an approved-tool list or its own glossary, use those as starting points and layer these lessons on top.

โ“ Teacher FAQ

Should students be using AI tools in my classroom?

It depends on your district’s policy, grade level, and tool. Most K–3 classrooms should not have students typing prompts into chatbots. Grades 4–6 with approved tools and teacher supervision: yes, for specific, bounded tasks. The lessons on this page don’t require students to touch AI at all; the thinking happens before the tool.

How do I catch AI-written student work?

Detection tools (Turnitin’s AI scanner, GPTZero) have meaningful false-positive rates, especially for multilingual students. Evidence suggests they’re unreliable as sole proof. The more durable approach is designing assignments AI can’t easily complete: assignments that reference classroom-specific content, require in-class drafting, or ask students to show their thinking. Process > product.

What if a student asks AI something inappropriate?

Approved school tools have guardrails, but they’re imperfect. Set expectations explicitly: “If something comes up that doesn’t feel right, close it and tell me.” Model this by walking students through an AI response occasionally and saying out loud what you would question.

Is it okay to use AI to grade or write feedback?

See our article on ethics of AI report cards. Short version: drafting is reasonable if you review and personalize every comment. Letting AI generate and send feedback without review is not.

What do I say when parents ask why we’re teaching this?

Two sentences: “Our students are already using AI in their daily lives: recommendation feeds, autocomplete, image generators. We’re teaching them to think critically about it so they’re making choices, not being shaped by systems they don’t understand.” Most parents find that reassuring.

What if my principal isn’t on board?

Start small. The three assignments on this page require no AI tools, no district approval, and no technology. They’re critical-thinking lessons that happen to name AI. Most principals support critical-thinking instruction.

I have ELL and SpEd students โ€” does this work for them?

Yes, with scaffolds. For ELL: front-load the vocabulary (the glossary helps), use visual examples first, allow native-language discussion of concepts. For SpEd: reduce choice load, use the same lesson structure multiple times before introducing a new one, and pair visual classification with verbal explanation.

๐Ÿ“ Standards Alignment

AI4K12 “Five Big Ideas” โ€” the national consensus framework. Perception (Assignment 1). Representation & Reasoning (glossary “Pattern”). Learning (glossary “Training Data”). Natural Interaction (glossary “Chatbot,” “LLM”). Societal Impact (Assignment 3, the FAQ).

NGSS Crosswalks โ€” K-LS1-1 (observing patterns in living things) โ†’ pattern recognition. 3-PS2-1 (cause and effect in patterns) โ†’ Assignment 2. 5-PS1-3 (observing to classify) โ†’ Assignment 1.

CCSS ELA โ€” Speaking & Listening (Kโ€“5 SL.1, 3.1, 4.1, classroom discussion). Writing narratives (W.3.3, W.4.3 โ†’ Assignment 3). Analyzing speaker’s purpose (SL.3.3 โ†’ Assignment 3 authorship discussion).

California CDE AI Guidelines โ€” Human-centered framing. Privacy-protective. Age-appropriate. Critical-literacy-focused.


๐ŸŒ External Resources

  • AI4K12 โ€” the Kโ€“12 AI curriculum framework (Five Big Ideas). Free, actively maintained, classroom-ready.
  • California Department of Education โ€” AI Guidelines โ€” non-mandatory but widely adopted. The framing most CA districts are converging on.
  • Common Sense Education AI โ€” reviews of AI tools for classroom safety and age-appropriateness.
  • ISTE AI Standards for Students โ€” competency framework for student AI literacy.
  • MIT RAISE โ€” Responsible AI for Social Empowerment in Education. Elementary-focused materials.

Maintained by EdTech Institute’s editorial team with input from California K–6 classroom teachers. Questions, corrections, or district collaboration? Get in touch.