The Augmented Educator

Asynchronous and Embodied Models

The Detection Deception, Chapter 8

Michael G Wagner
Nov 08, 2025

Fellow Augmented Educators,

Welcome to week eight of ‘The Detection Deception’ book serialization. New chapters appear here for paid subscribers each Saturday.

This week’s chapter moves from the philosophical why to the practical how. After last week’s installment established the theoretical foundations for a pedagogy of process, this chapter provides a set of tangible, AI-resistant assessment frameworks built upon that very foundation. It argues that if we are to escape the detection arms race, we must fundamentally shift what we grade. The chapter details three concrete models for achieving this:

  1. Process-based assessment (“grading the journey”),

  2. Embodied assessment (“the performance of understanding”), and

  3. Context-specific assessment (“the power of specificity”).

The chapter ultimately offers a pragmatic path forward. It argues that instead of trying to build a better “mousetrap” to detect AI, we should simply change the “cheese.” By designing assessments that value the very things AI cannot replicate, we make generative AI irrelevant and, in the process, create a more authentic, equitable, and meaningful way to measure human learning.

Thank you for reading along! See you in the comments.

Michael G Wagner (The Augmented Educator)


Chapter 8: Asynchronous and Embodied Models

The human desire to measure learning has always been at odds with learning’s essential nature as a process of transformation. We create endpoints where none naturally exist, draw boundaries around knowledge that refuses containment, and demand proof of understanding in forms that understanding itself resists. For decades, this tension remained manageable, a philosophical problem papered over by practical necessity. Students wrote essays, teachers graded them, and everyone pretended that these textual artifacts adequately represented the messy, recursive, deeply human process of coming to know something. Then, artificial intelligence arrived with the force of a revelation.

The crisis is also an opportunity. If machines can instantly generate the products we’ve been grading, perhaps it’s time to stop grading products and start recognizing process. If algorithms can mimic the surface features of understanding, perhaps we need to dig deeper, finding evidence of learning in places machines cannot reach. The following exploration examines three approaches to making the learning process itself visible and assessable. It considers how we might evaluate the journey rather than just the destination, how embodied performance reveals dimensions of understanding that text cannot capture, and how the specific, irreplaceable context of each classroom can become our greatest asset in preserving authentic assessment. These are not complete solutions but experiments in recognition, attempts to see and value the actual work of learning rather than its fossilized remains.

Grading the Journey, Not the Destination

The solution to our assessment crisis lies not in building better detection systems or engaging in an endless technological arms race, but in fundamentally re-conceptualizing what we choose to assess. The shift from product to process represents more than a tactical adjustment to the AI challenge. It reflects a deeper philosophical commitment to making learning visible in all its messy, iterative complexity. When we grade the journey rather than merely the destination, we create assessments that are not only resistant to AI substitution but also more authentic representations of genuine intellectual engagement.

Consider how human writing actually unfolds. A student confronting a complex assignment does not simply sit down and produce a perfect essay from beginning to end. The authentic writing process involves false starts, abandoned paragraphs, and conceptual breakthroughs that emerge through the act of writing itself. There are moments of confusion where ideas refuse to cohere, followed by sudden clarity when connections become apparent. Arguments grow, sometimes dramatically, as the writer discovers what they actually think through the process of trying to articulate it. This recursive, often chaotic journey of intellectual discovery cannot be replicated by an AI, which generates text through statistical pattern matching rather than genuine cognitive struggle.

The process-oriented approach transforms this previously hidden work into assessable evidence of learning. Multiple drafts become windows into cognitive development. A first draft might reveal a student grappling with basic comprehension, organizing scattered observations into preliminary themes. The second draft shows refinement as stronger arguments emerge and weaker ones fall away. The final version reveals not just the endpoint of thinking but the evolution of understanding. Each iteration carries what might be called a “cognitive fingerprint,” unique patterns of development that reflect an individual mind wrestling with ideas.

This approach requires educators to expand their conception of what constitutes assessable work. Process-based assessment recognizes the preparatory stages as valuable data about student learning. Research notes reveal how students evaluate and synthesize sources. And outline revisions show the development of organizational logic. Even abandoned attempts provide evidence of intellectual risk-taking and the willingness to pursue ideas that might not ultimately succeed.

The power of this approach becomes clear when we consider the role of peer review in the writing process. When peer review becomes a formal, graded component of the assignment, it transforms into something far more substantial. Students must show their understanding of course concepts not just through their own writing but through their ability to identify strengths and weaknesses in others’ work. The feedback a student provides becomes an assessable artifact in its own right.

Imagine a sociology course where students are writing about social inequality. Maria submits her first draft for peer review, and James provides detailed feedback. His comments reveal whether he understands the theoretical frameworks being applied. When he suggests that Maria’s analysis of intersectionality could be strengthened by considering economic factors alongside race and gender, he demonstrates comprehension of how these systems interact. When he identifies a logical gap in her argument about structural barriers, he shows his ability to think critically about causation and evidence. The quality, specificity, and theoretical grounding of his feedback become powerful indicators of his own learning, entirely separate from his ability to produce a polished essay.

This exemplifies the Socratic principle in practice—understanding becomes visible through the act of questioning and examining ideas. The peer review process creates multiple benefits beyond AI resistance. Students receive feedback from multiple perspectives, broadening their understanding of how their ideas land with different readers. They develop metacognitive awareness by articulating what makes writing effective or ineffective. The social dimension of peer review also introduces accountability mechanisms that purely individualized assignments lack.

The revision plan represents perhaps the most powerful tool in the process-based assessment toolkit. After receiving feedback from peers or instructors, students must articulate how they plan to address the critiques and improve their work. This document forces students to engage in explicit metacognition, moving beyond simply making changes to understanding why those changes matter. A strong revision plan demonstrates several sophisticated cognitive moves. The student must accurately interpret feedback, distinguishing between central criticisms that require structural changes and minor suggestions for polish. And they must evaluate which feedback to incorporate and which to respectfully decline, exercising intellectual judgment.

Consider a hypothetical revision plan from a history student writing about the French Revolution. After receiving feedback that her essay lacks a clear argument about causation, she writes:

“The primary criticism of my first draft is that I present multiple factors contributing to the Revolution without establishing a hierarchy of causation or showing how these factors interacted. In my revision, I will restructure the essay around the thesis that economic crisis was the necessary condition that transformed long-standing social tensions into revolutionary action. This requires significant reorganization. I’ll move my discussion of Enlightenment ideas from the beginning to show how economic hardship made these ideas suddenly relevant to broader populations. The section on the nobility’s resistance to tax reform, currently buried in paragraph four, needs to become central to my argument about the fiscal crisis as the triggering mechanism.”

This revision plan reveals genuine intellectual work that cannot be outsourced to an AI. The student demonstrates an understanding of the difference between correlation and causation, the ability to synthesize multiple historical factors into a coherent narrative, and the metacognitive awareness to recognize and address weaknesses in her own thinking.

The reflective essay offers another powerful window into authentic learning processes. Rather than focusing on what students know, reflective writing reveals how they came to know it. These documents ask students to examine their own learning journey with questions that probe beyond surface comprehension.

A student in a philosophy course might write about struggling with the concept of moral relativism: “The most challenging part of this essay was moving beyond my initial instinct to simply reject relativism as obviously wrong. I found myself getting angry at the readings, which made me realize I was bringing strong assumptions I hadn’t examined. The breakthrough came when I tried to steel-man the relativist position, arguing it as strongly as possible before critiquing it. This forced me to understand why thoughtful people might hold this view, even if I ultimately disagree.”

This reflection shows several forms of understanding that resist AI replication. Following Freire’s problem-posing approach, the student demonstrates how genuine learning involves confronting one’s own assumptions and engaging in critical self-examination. They show emotional honesty about their learning process, including frustration and breakthroughs. Most importantly, they show intellectual growth through strategic engagement with challenging ideas.

The focus on process addresses one of the most persistent challenges in writing instruction: the tendency for students to view revision as mere proofreading rather than substantive rethinking. When the entire journey becomes gradable, students cannot simply fix comma splices and call it revision. They must show a genuine evolution in their thinking. This pedagogical shift has benefits that extend far beyond preventing AI substitution. Students develop stronger writing skills when they understand revision as intellectual work rather than mechanical correction.

Implementing process-based assessment does require significant changes in course design and an increased instructor workload. However, the investment yields multiple returns. Instructors gain much richer data about student learning, allowing for more targeted intervention when students struggle. And the distributed nature of process assessment reduces the stakes of any single deadline, potentially decreasing student anxiety and the temptation to seek shortcuts.

This approach fundamentally reframes the relationship between students and their own work. Rather than viewing writing as a performance to be optimized for evaluation, students begin to understand it as a tool for discovering and developing their own thinking. Building on Bakhtin’s dialogic principle, the process becomes intrinsically valuable as students engage in dialogue with their own developing ideas. When students value the journey of learning for its own sake, the option to skip that journey by using AI becomes less appealing, not because it is forbidden but because it would mean missing the point entirely.

The Performance of Understanding

The written word has dominated academic assessment for so long that we often forget it represents only one mode through which humans can demonstrate understanding. In our current moment, when machines can generate flawless prose instantaneously, the limitations of text-based assessment become particularly acute. The solution lies not in abandoning writing but in expanding our conception of what counts as legitimate academic evidence. Video assessment offers a powerful method for capturing dimensions of understanding that text alone cannot convey.
