by The Educator Collaborative fellow, Troy Hicks
Given the manner in which AI tools are advertised to teachers — with at least one company promoting its tools as a way to make educators’ “free time really free,” among the many other extraordinary claims from other companies — it is no wonder that teachers are, indeed, using AI in these ways.
Now that we are in our third school year of the ChatGPT era, many educators find themselves in districts and schools that are encouraging — if not requiring — the use of AI in their instructional practice. While some schools are pushing back against student use of AI because of, according to the Brookings Institution, “possible risks of increased plagiarism and cheating, disinformation and discriminatory bias, and weakened critical thinking,” the vast majority of educators do say that they are using AI, at least for perfunctory tasks such as generating or modifying assignments and assessments, providing feedback on student work, and handling other administrative tasks.
Of course, the opportunity to easily create a chatbot that can serve as a tutor for one’s students, to adapt a worksheet for learners at different reading levels, and to quickly grade dozens of students’ essays does seem appealing. And, no doubt, some of these tasks are well suited to being offloaded to AI (though, of course, we should, as the Civics of Technology group calls for, continually ask critical questions of these tools, all the while remaining mindful of, as Kate Crawford reminds us, the data that they take from us, the water and energy that they use, and the biases that are present in them).
What is troubling, however, is that few, if any, teachers report how they effectively model discipline-specific, content-rich examples of AI integration for their students. In other words, many educators are using AI, but are not necessarily teaching AI.
This is problematic at two levels. For teachers, uncritically adopting AI, as argued by the English Language Arts Teacher Educators constituent group of the National Council of Teachers of English, is not the correct stance: “Educators must stay vigilant and curious as GenAI platforms evolve.”
And, for our students, as Trevor Aleo contends in his handbook on AI for the Human Restoration Project, “We must recognize like any other literacy that AI is a skill students must know how to use, while taking a critical lens to the implications of AI on our classroom and the world more broadly.”
Many other scholars and educators have made similar calls, and the need to engage students in AI literacy is clear.
Critically Thinking “About” and “With” AI
As I have been suggesting on the Educator Collaborative Blog for a few years (here, here, and here, specifically), literacy educators in particular are well-poised to model for and then coach our students toward “productive, and perhaps even creative, opportunities” with generative AI tools.
While this may be difficult — mostly because many of the tools available to students are designed as tutors or grammar assistants, not the kinds of AI we typically imagine, like ChatGPT or Gemini — I contend that the cost of not teaching our students how to write effectively with AI could be worse than any ill effects that cheating with AI might cause.
Indeed, we need to teach our students how to think critically about AI, as well as think critically with AI.
To help students think critically about AI, numerous resources are already available. In addition to the links above, Common Sense Media’s AI Literacy Lesson Collection offers many freely available lessons, Teach AI from aiEDU has a fully fleshed-out AI curriculum, and the “Empowering Learners for the Age of AI” framework embeds many lesson suggestions clustered around the themes of “engage,” “create,” “manage,” and “design” with AI. In these ways, we can teach students how AI works and what its limitations and capacities actually are, and move them beyond simply interacting with a pre-designed chatbot to complete mundane tasks.
Then, to the more important task of helping students think critically with AI: we need to provide deliberate opportunities — albeit structured with guardrails — so they can learn how to think critically with, as well as write with, AI. And, if you don’t think your students are already writing with AI, then it is time to explore the many tools available, beyond the typical AI chatbots that come to mind first.
Being and Becoming Writers in the Age of AI
So, if you, as an educator and author, have not yet explored tools like Sudowrite, Rytr, Wordtune, Quillbot Flow, or Lex.page, now is the time to do so. As you do, I encourage you to adopt the stances that Kristen Hawley Turner and I outline in our book, Teaching Writing in the Age of AI, where you engage with the tools as thinking partners, research assistants, and, ultimately, co-writers.
- As a thinking partner, you can employ AI tools that will help you identify multiple perspectives, provide warm and cool feedback, and serve as a sounding board while you test out new ideas.
- As a research assistant, AI tools can find relevant sources to explore further and to substantiate the claims we are trying to make; even the free models now offer more and more real-time search of the web, though — as the warning notes on AI outputs still suggest — we need to be wary of potential errors in what AI produces.
- And, finally, as a co-writer: we are all familiar with “auto-complete,” and yet the AI tools noted above are becoming even more integrated into our writing processes, with options to “expand,” “condense,” and “rewrite” our own text, as well as to generate new text that we can then revise to our liking.
Each of these stances toward AI positions us as agentive writers in the age of AI. In taking these stances, you can begin to think about how your own writing process(es) are influenced and altered by AI, as well as when it is best to step away from AI and engage in what the West Ed Digital Fluency Group and the National Writing Project describe as “friction by design,” or “The messy, effortful parts of the learning process—struggling with ideas, revising work, grappling with confusion, [and] engaging in dialogue,” completely separate from or only in light consultation with AI.
As you work to understand your own writing process in partnership with these AI tools — from brainstorming and drafting, through to revising and publishing — you can then speak knowledgeably to your students about the choices you make as a writer, both about the AI tools you use and about what you do when you write with them. It is only then that we — as teachers of writing who are writers ourselves — can effectively model our own process, enhanced by AI, so our students can learn the complexities and nuances of crafting writing with the many technologies available to them.
AI Disclosure Statement
I acknowledge the use of Google Gemini to help generate ideas for a title (including the subtitle, “Adopting Agentive Stances for Teaching Writing in the Age of AI”) and Google Nano Banana to create the cover image with the prompt: “Generate an image to reflect the themes in this blog post, showing a teacher in their classroom engaged in the process of writing on a laptop computer with the assistance of AI.”
