Chapter One: A Seductive Thought

There is a sentiment circulating in the staff rooms and Substack threads of the educational world, a truth universally acknowledged by everyone except, perhaps, the procurement departments. It is a quiet resistance (though we are getting louder), often whispered over lukewarm coffee or typed furiously into WhatsApp groups at the end of a long term. It is the observation that “there isn’t a single problem ‘solved’ by EdTech that couldn’t be fixed with smaller classes led by well-paid teachers given real academic freedom.”

It is a seductive thought. It imagines a world in which the solution to student engagement is not a gamified app flashing with dopamine-inducing badges, but a teacher with the time to look a child in the eye and notice they are fading. It suggests that the answer to crushing marking workloads isn't an AI grading bot that scans for keywords, but a timetable that allows a human being to read an essay with a cup of tea in hand, and not at 11 PM on a Sunday night. It imagines a system in which the "user interface" is a conversation and the "operating system" is trust.

Reviewing the programme for the recent TechAbility Conference, and speaking with attendees in the margins of the event, I found myself viewing this tension through a distinctly literary lens. From here, we can stop debating budgets and software licences and turn, instead, to the central conflict of Jane Austen’s autumnal masterpiece, Persuasion. (Insight into how my brain works.)

For those who have left their classics on the shelf, Persuasion is a story of second chances, lost bloom, and the danger of listening to the wrong kind of advice. In the novel, our heroine, Anne Elliot, is persuaded by her well-meaning mentor, Lady Russell, to reject Captain Wentworth. (Yes, yes, he does wear very tight trousers.) The match is deemed "imprudent." (Not just because of those trousers.)
Wentworth has no fortune, no connections, and an uncertain future. He offers only love, vitality, and a meeting of minds. Years later, Anne is pushed instead toward the slick, socially advantageous Mr Elliot—a man who says all the right things, possesses all the right data points, and holds the keys to the estate, but is ultimately hollow.

Today, the education sector is Anne Elliot. We are a profession that feels it has lost its "bloom," worn down by years of austerity and metric-chasing. And we are constantly being persuaded by our own Lady Russells—the policymakers, the consultants, the efficiency experts—that investing in the "Wentworths" is simply impossible. To hire enough teachers to reduce class sizes to fifteen? To pay them a wage that reflects their expertise? To give them the autonomy to deviate from the curriculum when a student’s eyes light up? Imprudent! Too expensive. Too risky. It lacks "scale." It cannot be plotted easily on a dashboard. It is a romantic notion, we are told, incompatible with the hard realities of the modern economy.

Instead, we are courted by our estranged cousins, the Mr Elliots of the world. Enter the shiny EdTech platforms, the Large Language Models, the predictive analytics suites. Like Mr Elliot, they are smooth, modern, and presentable. They promise to secure the estate's future. They promise "efficiency" and "personalisation at scale." They whisper that they can take the burden off our shoulders, automate the drudgery, and leave us free to be "facilitators." Imagine evenings and weekends, free! Oh, I must fan myself to calm such a happy countenance.

But, like Mr Elliot, this technological courtship often masks a cold, transactional void. We are being asked to trade the messy, expensive, unscalable vitality of human connection—the Captain Wentworth of it all—for a sleek system of inputs and outputs. We are building digital infrastructures that mimic the form of education without its soul.
We are creating a "future-proofed estate" where the lights are on, the data is streaming, but no one is actually home.

The tragedy of Anne Elliot was that she allowed herself to be persuaded that prudence was a virtue, only to spend eight years in a state of regret, watching her life shrink into a small, silent room. The risk for us, as we stand on the precipice of the AI revolution in schools, is that we do the same. We risk allowing the logic of the machine to persuade us that the human element is a luxury we can no longer afford.

Yet, as I looked more deeply into the TechAbility conference speakers and spoke with participants, I realised the story is not quite as binary as "Tech vs. Human." Sometimes Mr Elliot is a villain, but sometimes technology is the carriage that brings Wentworth back to us. The question is not whether we use the machine, but who is holding the reins.

Chapter Two: The "Cyborg" in the Classroom

The friction between human connection and technological intervention was palpable in Richard Fletcher’s keynote, “Exploring Hybrid Help”. The title alone suggests the unease of our current moment. We are not simply using tools; we are drifting into a "hybrid" state where the boundary between personal aid and technological interference is becoming dangerously blurred.

If EdTech is merely a way to manage the symptoms of an underfunded system—using GenAI to "personalise" learning because there are 35 children in the room—then the opening observation holds true. A smaller class would fix that. A teacher with time is the best personalisation engine ever invented. When we replace that human interaction with an algorithm, we risk what Fletcher alludes to as the loss of the "human loop." We are building systems that mimic the form of education—Mr Elliot, in his fine coat—without the soul of understanding.
The Rise of the Tryborg

Fletcher drew our attention to a critical distinction in the cyborg identity, referencing Jillian Weise’s concept of the "Tryborg". The "Tryborg" is the nondisabled person who adopts technology for efficiency, for fun, or for profit. They choose to extend themselves. They are the students using ChatGPT to write an essay in seconds; they are the administrators using AI to generate policy documents that nobody will read. These "Tryborgs" are not true cyborgs. They do not depend on the machine to "breathe, stay alive, talk, walk, or hear". For them, the technology is a shortcut, a way to bypass the cognitive struggle of learning. And this is where the danger lies.

The Closed Loop of Non-Cognition

We are currently constructing a closed loop of non-cognition. Fletcher highlighted the emerging risks of "cognitive debt" and the erosion of critical thinking. Consider the bleak absurdity of the modern classroom: a student uses an AI to generate an essay they haven’t written, and a teacher uses an AI to grade an essay they haven’t read. You do not need to persuade me that this is horrific for learning and humanity. The machine talks to the machine. The student gets a grade; the teacher gets a completed spreadsheet. It is a perfect, frictionless system. It is also a complete farce.

This is the "Mr Elliot" of education: polite, polished, socially acceptable, and entirely hollow. As Fletcher noted, GenAI is "constitutively irresponsible"—it produces knowledge claims with no author to answer for them. When we invite this into the classroom, not as a tool but as a tutor, we are teaching our children that the appearance of competence is more valuable than the messy, difficult work of actual competence.

The Cost of Loneliness

But the cost is not just intellectual; it is deeply social. Fletcher warned of the "cost of loneliness" when artificial intelligence substitutes for human interaction.
Education is not just the transmission of facts; it is the "non-coercive rearranging of desire". It is a relational act. When we place a chatbot between the learner and the teacher, we sever that relationship. We create a "panopticon" (thank you, Foucault) of surveillance in which every keystroke is tracked, yet no one is truly watching. We risk creating a generation of students who are technically connected but profoundly alone, interacting with "sycophantic" bots that validate their errors rather than challenge their thinking.

In Persuasion, Anne Elliot is surrounded by people yet entirely alone in her understanding of the world. She sits in the drawing room, listening to the noise of the Musgroves and the smooth flattery of Mr Elliot, but her mind is elsewhere. We are building digital classrooms that replicate this isolation. We are filling the silence with the chatter of algorithms, mistaking data for connection. We must ask ourselves: are we using technology to bring us closer to the "Wentworths"—the authentic, challenging, human encounters—or are we using it to build a more efficient, automated solitude?

Chapter Three: The Exception
When Tech is Voice, Not Just Efficiency

However, to embrace the "smaller classes" argument entirely is to miss a crucial nuance—one that requires us to step out of the comfortable, wainscoted warmth of the Austenian drawing room and into the bracing reality of complex disability. If we remain solely in the debate about efficiency, we risk ignoring those for whom "efficiency" is irrelevant because access is the primary battle. There are problems that smaller classes alone cannot solve. There are silences that even the most patient, well-paid, and autonomous teacher cannot break without a machine to help them listen.

In Persuasion, the horror of Anne Elliot’s life is her muted existence; she is present, but unheard. "She was only Anne," the novel tells us.
But in the modern classroom, some students face a silence far deeper than social exclusion. Oh, cutting.

The Command of the Gaze

Take Harchie Sagoo, whose keynote address, “I Lead, You Follow,” challenged the very premise of who is in charge of the educational narrative. Harchie has cerebral palsy. In a traditional setting, without technology, he might be viewed through a lens of passivity—a student to be "cared for," to be "managed." Yet Harchie uses a GridPad 13 with eye-gaze technology. For Harchie, a smaller class led by a well-paid teacher is wonderful, but it does not give him a voice. The technology does.

In his presentation, Harchie described how his setup allows him not just to complete schoolwork but to exert agency over his world. He uses his eyes to answer the Ring doorbell and scare the postman. He uses them to turn off the shower while his father is midway through washing. These are not "learning outcomes"; they are acts of glorious, mischievous rebellion. They are the proofs of a personality imprinting itself on the world. Here, EdTech is not about "efficiency"—it is not Mr Elliot trying to streamline the estate. This is EdTech as liberation. It transforms the user from a passive recipient of care into a leader who can, quite literally, tell the world to "follow."

The Voice from the Silence

If Harchie represents the power of the visible gaze, Dr Rosie Woods took us into the realm of the invisible. Her session, “Giving a voice to those who cannot speak,” highlighted the frontier of sub-vocal speech recognition for people with Profound and Multiple Learning Disabilities (PMLD). Dr Woods challenged the assumption that people with PMLD are "pre-linguistic" simply because they cannot articulate sounds. She introduced us to the concept of "sub-vocal speech"—the silent, internal speech that occurs in the brain and muscles even when no sound is produced. Using specialised microphones and software, her team recorded and amplified this internal voice.
The results were striking. One participant, Lizzie, who was previously unable to communicate clearly, was recorded saying: "I can’t write… but I can talk. I know what’s planned, I feel safe".

Pause on that for a moment. “I know what’s planned.” No amount of academic freedom, no reduction in class size, and no amount of teacherly intuition can decode sub-vocal speech without the hardware. Without the tech, Lizzie is trapped in a room with no doors. With the tech, the door opens. Here, the technology is not a replacement for the human; it is the bridge to the human. It is the only thing that allows the "well-paid teacher" to actually do their job: to listen.

The Access Paradox

This brings us to the Access Paradox. The critique of EdTech in Chapter One stands firm for the neurotypical, mainstream experience: we do not need AI to grade essays or generate lesson plans that a human should craft. That is "lazy" tech. But for Harchie and Lizzie, technology is the "Wentworth" factor. It is the vessel of their vitality. It is the tool that allows them to reclaim their "bloom." To dismiss all EdTech as a neoliberal ploy to replace teachers is to inadvertently condemn these students to silence. We must distinguish between the technology that automates the human experience (bad) and that which enables it (essential). The former is a cage; the latter is a key.

Chapter Four: The Synthesis
Tech Needs the "Wentworth" Factor

So, where does that leave our original provocation? If we accept that technology is essential for access (as Harchie and Lizzie demonstrated), does that mean we must submit to the hollow efficiency of the Mr Elliots? Must we accept the premise that machines should replace the expensive, messy work of human teaching? Not at all. In fact, the evidence from the conference suggests that the original blog prompt was half-right. Technology does not solve problems in a vacuum.
It fails spectacularly and expensively when treated as a replacement for human expertise rather than as a tool that requires more of it. The answer lies in the Kingspark School case study presented by Paula Kane and Eimer Galloway. Their journey offers a blueprint for what happens when you stop buying "solutions" and start investing in souls.

The Investment in Character

Kingspark faced a familiar dilemma. They had the technology—the DriveDecks, the switches, the hardware—but it wasn't being used effectively. The "Mr Elliot" of the situation (the shiny equipment) was present, but the relationship was cold. Why? Because the staff lacked confidence. They were paralysed not by a lack of desire, but by a lack of support.

Their solution was not to buy more software. It was to invest in the "Wentworth" factor—human competence, constancy, and autonomy. They secured funding not for gadgets, but for a person—specifically, an Assistive Technology Team Leader. They understood that technology is inert without a champion. They established a "Community of Practice", a dedicated space for staff to share knowledge, mirroring the camaraderie of Wentworth’s naval officers rather than the isolated competition of the Elliot family. Crucially, they listened to their staff, who demanded "interactive and functional training that takes place in directed time". They realised that you cannot learn to wield these powerful tools in the margins of a frantic day. They ring-fenced time. They prioritised "hands-on" experience. They proved that for technology to work, schools need exactly what the original blog prompt demanded: time, autonomy, and specialised roles. Look what flexibility and time can do!

The Map and the Territory

This necessity for human rigour is reinforced by the systemic work of Rohan Slaughter and Tom Griffiths in their presentation, “Developing an AT Competency Framework”. If Kingspark provided the narrative, Slaughter and Griffiths provided the map.
They argue that we cannot simply drop tools into a classroom and expect miracles. That is the "Mr Elliot" approach—all surface, no substance. Instead, we need a "training ecosystem". Their framework breaks down the necessary human skills into four distinct phases: Assessment, Provisioning, Ongoing Support, and Review. Note that the technology itself is only a fraction of this cycle. The rest is human judgment, human observation, and human adaptability. They highlight that "AT is not the prevail of one particular job role – everyone has a role". This dismantles the idea of the "plug-and-play" solution. It suggests that true technological integration requires a "Captain Wentworth" level of discipline and skill. It requires a professional class who are not merely "users" of a system, but masters of it.

The Piano and the Pianist

The synthesis of these arguments brings us to a singular truth. The technology did not "solve" the problem at Kingspark in isolation. The technology was merely an instrument, like a fine piano sitting in a drawing room. It required a pianist with the training, the time, and the passion to practise. When we view EdTech through this lens, the conflict between "tech" and "teachers" dissolves. We do not need fewer teachers; we need more teachers, and we need them to be more highly skilled than ever before. We need them to be the "Wentworths" who can navigate the complexities of sub-vocal recognition and eye-gaze calibration with the same confidence that they navigate a curriculum.

The danger is not the technology itself. The danger is the "persuasion" that the technology allows us to be cheap. The danger is believing Mr Elliot when he says we can fire the pianist because the piano can play itself.

Chapter Five: The Second Spring

At the very end of Persuasion, Anne Elliot is granted what the narrator calls a "second spring" of youth and beauty.
Crucially, this renewal does not come because she has acquired a new accessory, or a better carriage, or a more efficient way to manage her household accounts. It comes because she has reclaimed her connection to Captain Wentworth. She has chosen the difficult, vibrant, human path over the safe, calculated hollowness of Mr Elliot. (It’s the trousers.)

The lesson for us, as we navigate the noisy marketplace of modern education, is that EdTech and "Human Tech" (teachers) are not binary opposites, though they are often sold as such. We are constantly subject to the same "persuasion" that plagued Anne. We are persuaded to buy the software because it is cheaper than hiring a teaching assistant. We are sold the chatbot because it is easier than reducing the caseload. We are told that if we just adopt the right platform, the structural cracks in the walls will cease to matter.

The Inertia of the Machine

But the evidence from TechAbility 2025 shatters this illusion. It proves that the most powerful technology is utterly inert without the warmth of human expertise to animate it.

Consider the work of Dean Hall at Treloar’s. His session on 3D printing was not a paean to the printer itself—a machine of plastic and heat. The "miracle" was not that the machine could print a joystick knob; the miracle was that Dean, with his engineering background and human empathy, could design a bespoke "magnet assessment knob set" to allow a specific child to drive their own wheelchair. The printer is just a tool; Dean is the architect of access.

Consider Kirsty McNaught’s work on block-based coding. The software existed, but it was full of barriers—drag-and-drop interfaces that locked out eye-gaze users. It took a human expert to dismantle those barriers, creating a "keyboard accessible" bridge so that a physical disability does not preclude a digital education.

And consider Harchie Sagoo and Dr Rosie Woods. The technology—the GridPad, the sub-vocal sensors—was the vessel.
But the cargo was the human personality. The technology did not replace the need for connection; it created the possibility of it. As Harchie’s presentation title reminds us, the goal is not for the machine to lead, but for the human to say: "I Lead, You Follow".

Holding Out for the Real Thing

There isn't a single problem solved by EdTech alone. A 3D printer in a cupboard solves nothing. An eye-gaze camera without a trained therapist is just expensive glass. But there are miracles achieved by EdTech when it is placed in the hands of a teacher who has been given the freedom, the time, and the support to use it. When we invest in the "Wentworths"—the staff, the specialists, the time to care—the technology sings.

We must stop letting the Mr Elliots of the tech world persuade us that they can replace the heart of the profession with a dashboard. We need to stop apologising for the cost of human expertise. We need to hold out for the real thing. Only then will education see its second spring.

With sincere thanks to the presenters and attendees at TechAbility 2025 for their insights, and particularly to Harchie Sagoo for reminding us that while technology is the tool, independence is the goal. I learned so much.

Dedication

To you who persuaded me to pick up books again. Thank you for cracking the spine of stories I thought were shelved and for proving that while the machine processes the text, it takes a human to find the subtext.