AI Ethics in Education

AI in education isn’t a future conversation. It’s happening right now, in classrooms at every level, and we’re mostly making it up as we go. After twenty years building education programs, I have some thoughts — but I want to be honest that I don’t have neat answers.

The access problem

AI tools cost money. The good ones cost more. ChatGPT Plus, Claude Pro, Copilot — these are subscription products. Schools and families that can afford them get access to genuinely powerful learning tools. Schools and families that can’t are falling further behind.

We’ve seen this pattern before with technology in education. Laptops, internet access, software licenses — each wave widened the gap before eventually narrowing it. But “eventually” can mean a generation of kids. The question is whether we’re being intentional about access or just hoping it works out.

The authenticity problem

When a student uses AI to write an essay, what did they learn? This is the question that keeps teachers up at night, and the honest answer is: it depends entirely on how the assignment was designed.

Banning AI doesn’t work. Students will use it anyway, and more importantly, they should learn to work with these tools — that’s a real skill they’ll need. But blindly allowing AI without rethinking what we’re asking students to do doesn’t work either. If the assignment can be fully completed by AI, maybe the assignment needs to change.

The best teachers I know are redesigning their assignments so that AI is a tool in the process, not a replacement for the thinking. The essay isn’t the product — the thinking is the product. AI can help with the essay. It can’t do the thinking for you.

The teacher question

AI can grade papers, generate practice problems, personalize learning paths, and answer student questions at 2am. These are genuinely useful capabilities. But the thing that makes education work — the human relationship between a teacher and a student — is not something AI can replicate.

The best use of AI in education is giving teachers more time and better tools so they can focus on the work that only humans can do: inspiring curiosity, building confidence, recognizing when a student is struggling in ways that don’t show up in their answers.

What I think we should actually do

At Jumpstart Lab, this shapes everything we build. Our workshops don’t just teach people to generate AI output — they teach people to evaluate it. To know when Claude is right, when it’s confidently wrong, and when the answer requires judgment that no model can provide.

The goal isn’t AI fluency for its own sake. It’s critical thinking about a tool that’s going to be everywhere. We owe it to students, teachers, and professionals to approach this thoughtfully — not with fear, not with hype, but with the honest work of figuring it out together.


Jeff Casimir
Principal, Jumpstart Lab
jeff@jumpstartlab.com

Does your team need help integrating Claude Code into everyday work? Through workshops and coaching, I can help them reach their potential.

Book a Conversation