In April 2020, Georgia’s State Superintendent, Richard Woods, sent an open letter to school districts, teachers, parents, and students that encouraged them to “choose compassion over compliance,” especially given the unique circumstances each student was facing amidst the unfolding crisis. The letter extolled teaching as heroic, noted the foundational importance of school-community connections, and underscored that grades and standardized tests were a distant second to shared well-being. School leaders across the country echoed similar themes, calling for grace, self-care, and solidarity against COVID-19, as schools leaned in to being sites of community care, distributing meals, connecting families to technology, and, in some cases, providing ongoing mental and physical health services.
Away from the public eye, internal conversations about instruction and assessment in many districts did not match the public messaging about grace, care, and compassion, as tensions about what it meant for students to be “accountable” during the last few months of school took center stage. In my context, this meant hours-long meetings fraught with worry about incomplete content coverage, issuing grades, and monitoring student behavior, even while students were home. By April 2020, NWEA had already issued a report on the so-called “COVID slide,” a progenitor of “learning loss” discourse, complete with alarming graphs and charts that forecasted deep drop-offs in achievement, and other organizations, like McKinsey, quickly followed.
These reports quickly framed the conversation—they became the unquestioned common sense—about what it meant to teach and learn during COVID. Teaching and learning became about “catching up,” which involved sorting kids into who needed remediation and who needed acceleration, often with the help of those issuing the reports. But “catching up”—and “catching up” fast—required more seat time, more content, and, ultimately, more control. Taken together, what it meant to support students, especially in a time of upheaval and uncertainty, became primarily about maintaining and raising traditional metrics of achievement, the very things school leaders had said were secondary just weeks earlier.
While organizations like NWEA and McKinsey were issuing reports, ed tech companies were also sharpening their pitch to schools and districts with programs that promised remediation, acceleration, and, most of all, an ability to track student behavior, even while students were remote. Slick marketing campaigns pitching software as the answer to the “COVID slide” were inevitable and effective, especially as it became clear that software could be used to compel compliance. For instance, seamless integration from assignment turn-in to the always-on gradebook meant instantaneous zeros for missing work and, in many cases, immediate notification of adults. Ruthless efficiency was critical, especially since we were locked in a race against lost learning.
When Minneapolis Police murdered George Floyd in May 2020, the questions abolitionist students, activists, and organizers had been raising about policing, especially policing in schools, for decades finally received national attention and interest. Conversations about how schools had more police than social workers and counselors moved to the front burner, and there were more public proclamations from school leaders about the need to reimagine existing systems and structures, but the technological purchases districts were making told a different story.
Indeed, the pedagogical and instructional conversations in many districts didn’t mirror the public-facing discourse about policing, as less obvious, but no less insidious forms of command and control remained, including many of the surveillance technologies that schools and districts had just purchased to ensure kids were “catching up.” That new learning management system (LMS) could tell you when students logged on, for how long, and what they clicked—or didn’t—while they were there. These were billed as tools to support students, but, for these companies and for many districts, student support really meant figuring out how to ensure students were compliant even when they were in their own homes.
In virtual settings it became less and less clear where the school’s authority stopped, as surveillance software kept tabs on every click that students were—or weren’t—making, especially if they were using a school-issued device. Many of these systems also offered integration with other surveillance tools, like plagiarism detection software, which further extended the reach of the institution’s prying eyes, as schools became increasingly concerned about academic dishonesty during this period, a moral panic that has carried over into conversations about AI.
A particular form of academic dishonesty, cheating on tests, seemed especially arresting to many schools and districts, including my own at the time. With a slick pitch, we heard about and from virtual proctoring companies that offered a range of options, from locking students into a single web browser to video recording their every move, all in the name of academic integrity and test security. The district I was in chose a program whose website tells teachers they can “manage class like magic.” The system allowed teachers to monitor the online activity of their students on school-issued devices, including seeing screens in real time, full access to browser histories, and the ability to freeze or lock screens.
While these surveillance technologies were billed as keeping kids laser-focused on academic acceleration to “catch up,” their use could never be apolitical. The use of electronic surveillance mirrors the patterns of physical surveillance, meaning that those who were always-already vulnerable, including students of color and disabled students, bore the heaviest scrutiny. Infamously, the facial detection features on some video proctoring software struggled to recognize Black and Brown faces and consistently flagged neurodivergent students for not maintaining eye contact with the webcam. Whenever a “suspicious” movement was flagged, a notification would be sent automatically to the instructor, complete with clips and an abnormality rating, almost mirroring the seamless integration between the LMS and the gradebook or its integration with plagiarism detection software. This engages teachers in a kind of “broken windows” policing, where only certain students—those flagged by the algorithms, students of color and disabled students—are being surveilled.
The root causes of surveillance technology—fear, mistrust, and insecurity—also influence how teachers teach. Concerns about academic integrity have meant that students are subjected to more in-person timed writing or, barring that, a reversion to the browser lockdown programs that were popular during virtual learning. Recently, Turnitin announced that it was working on software that could detect whether a paper is AI generated, an announcement that was widely celebrated. The irony of the celebration is that the data students were compelled to feed into systems like Turnitin are, in part, responsible for the rise of AI. John Warner has repeatedly sounded the call for instructors to revisit their writing pedagogy and practice rather than use surveillance software.
Yet fear, mistrust, and insecurity mean surveillance software is as popular as ever. Some software even promises to monitor student files to protect against would-be attackers, even sending information directly to law enforcement, creating still closer ties between schools and policing, the very ties we disavowed and promised to reimagine in the summer of 2020. Like video proctoring software, this sort of surveillance software produced many false alarms. There was a report of the software flagging profanity in a school’s literary magazine as suspicious, and a story in Wired revealed that school administrators in Texas had access to students’ conversations on private devices because students had plugged them into their school-issued laptops to charge. This becomes especially troubling as bans on abortion and gender-affirming care become law in many states across the country, meaning that students might be outed, even if their searches occur on private devices. The same Wired story pointed to a Boston Globe story revealing that school surveillance software exposed the records of undocumented students to law enforcement, putting them at greater risk of deportation.
There are many stories that we can tell ourselves about ourselves when it comes to surveillance, stories about averting “learning loss,” maintaining academic integrity, or hardening schools against the next attacker, but, in each case, fixing “flawed” individuals is centered over fixing unjust and unresponsive systems. These stories also buffer teachers, administrators, and others in schools from reflecting on their acceptance of surveillance. The websites for many of these companies feature enthusiastic quotes from teachers and administrators, and, even anecdotally, most people I’ve interacted with accept these technologies as benign or even helpful.
Audrey Watters visited one of my virtual classes in 2020, and she spoke about how the popularity of these technologies is derived from their perceived ability to bring order to the chaos of teaching, to make the metrics and parameters of a complex enterprise legible. It also outsources these functions to algorithms, leaving discipline to the machines. It’s not what you can do for surveillance technology, it’s what surveillance technology can do for you. Watters said this:
Surveillance in schools reflects the values that schools have (unfortunately) prioritized: control, compulsion, distrust, efficiency. Surveillance is necessary, or so we’ve been told, because students cheat, because students lie, because students fight, because students disobey, because students struggle.
Sounds like what Jeffrey Moro calls “cop shit” to me.
In the summer of 2020, about six weeks after George Floyd’s murder, during an online lecture, Fred Moten talked about the ways in which abolition forces us to make disquieting links between institutions and ideas we want to believe are separate, like school and surveillance, especially insofar as surveillance is tied to law enforcement. Those disquieting links, hopefully, cause us to “do shit differently right now” in the interest of a better present and future. Moten argues that movements away from policing are incomplete if they don’t consider how administrators and faculty also need to relinquish the policing present in their own roles. Put differently, moving away from policing requires those of us in and around schools to “do shit differently right now.”
What might it mean to “do shit differently right now” and stand with our students against surveillance in practical ways?
- Lean into Jesse Stommel’s invitation to start by trusting students, working directly against fear, insecurity, and mistrust. In other words, students are not adversaries, and we can’t start with that assumption. Moro calls philosophies and structures that pit teachers and students against one another “cop shit,” and starting with trust is a good first step to avoiding it.
- In addition to trusting students and avoiding “cop shit,” we must actively resist the use of surveillance technology in our classrooms. Standing in solidarity with our students, especially those at the margins, means not requiring their engagement with any of these technologies. It’s also important to advocate with decision makers in schools and districts against the purchase and use of these technologies. Cathy Fleischer and Antero Garcia have a useful framework to help teachers be “everyday advocates” for causes that matter to them.
- We can heed John Warner’s call to reimagine our pedagogies and practices to be more generative, responsive, and liberatory in resistance to the rise of Skinner-inspired technology. As he reminds us, we do not do our best work when we are under surveillance, especially constant surveillance from someone, or something like an algorithm, that holds power over us.
There are probably a thousand more ideas, but as Audrey Watters said to my students, it’s time to disentangle ourselves from the fear and mistrust that make surveillance technology an attractive option and to develop a cohesive vision for a more trusting, less algorithmic education system. It’s time to stand in solidarity with students against surveillance for a better present and future.