Shea Swauger

F*cking Hammers

“A hammer can either build a house or tear it down.” I’ve heard this idiom, or some derivative of it, many times when people talk about technology. The spirit of the phrase is that technology isn’t inherently good or bad; it’s how you use it. It’s also wrong. Technological neutrality, and its cousin, data objectivity, are core concepts in the tech industry. The first claims that technology is a tool that, while perhaps having a technical heritage, is unfettered by cultural context and devoid of bias. The second claims that the best way to legitimize something is to have data to back it up, as though data were a metaphysical window into reality. Again, wrong.


Technology is a product and reification of the cultural and social forces which birthed it. This means that when a software company builds a product, that product is a reflection of the design team who coded it, the focus group they hired, the company who hired them, the educational system they attended, the venture capitalist who invested in the company, the demographics of the city the employees work in, who runs the local city council, and on and on and outwards until we have the full scope of humanity and history. Technology is enmeshed in all of it.


This is important because when we introduce technologies into education, they come with baggage. Being aware of the implicit motivations and pedagogical commitments of a technology can change how, or whether, we decide to use it when we teach. Take the Learning Management System (LMS). CU Denver, one of the universities I work for, uses Canvas, which is owned by a company called Instructure, which was recently sold to a private equity firm called Thoma Bravo. Here's a synopsis of the sale and why it matters. The bad part of this is that Canvas collects data about students and teachers all the time. Our data becomes their property and, added up together, it becomes extremely valuable. Think about every time you've logged in to an LMS: what time it was, where you were, how long you were on, what you did, how long you spent on each page, the order of pages you clicked on. This data is gold to software developers who make platforms for ed tech markets. Essentially, our data is being sold without our informed consent, and we are definitely not going to be getting a cut of the sale. I hesitated even to put up my profile picture, because that now belongs to Thoma Bravo, a firm that has made little commitment to privacy or to protecting student data. Here's a letter written to the company that outlines these concerns more in depth. Canvas is no better or worse than its competitors, such as Blackboard or Desire2Learn, but that doesn’t mean we should rest easy. The revenue incentive for for-profit companies in a capitalist economic system makes it all but inevitable that the first priority of ed tech companies is monetization. When things like privacy, consent, and equity get in the way of monetization, they will lose.
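To make that data collection concrete, here’s a minimal sketch of what a single LMS clickstream event might look like, and how quickly events pile up. The field names, numbers, and schema are entirely hypothetical illustrations of the kinds of data described above; they are not taken from Canvas’s actual logging.

```python
from datetime import datetime, timezone

# Hypothetical clickstream event an LMS might record for one page view.
# Field names are illustrative, not Canvas's real schema.
event = {
    "user_id": "student_8675309",
    "event_type": "page_view",
    "page": "/courses/101/modules/week-3",            # what you did
    "timestamp": datetime(2020, 9, 14, 23, 41,
                          tzinfo=timezone.utc).isoformat(),  # what time it was
    "ip_address": "203.0.113.42",                     # roughly where you were
    "time_on_page_sec": 1260,                         # how long you spent
    "referrer": "/courses/101/assignments/essay-1",   # the order you clicked in
}

# A rough, made-up sense of scale: modest per-student activity adds up fast.
students = 15_000
events_per_student_per_day = 40
days_per_semester = 110
total_events = students * events_per_student_per_day * days_per_semester
print(f"{total_events:,} events per semester")  # 66,000,000 events per semester
```

Multiply a record like that by every click, every student, every semester, and it’s easy to see why this data is an asset on someone’s balance sheet.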


Another common ed tech being deployed in online classes is remote proctoring. These tools are meant to deter and catch people who cheat on tests by embedding technologies like AI, machine learning, and biometric analysis. Unfortunately, these tools have a long history of discriminating against students who are Black, LGBTQ, and disabled. Regardless, universities have decided that it is more important to keep the reputation of the school rigorous, the quality of the degree pure, or whatever justification they offer for their militaristic approach to academic integrity, than it is to ensure that the university never discriminates against its most vulnerable students. Remote proctoring companies capitalize on education’s proclivity to distrust students, especially ones who have been historically excluded from it. By adopting remote proctoring in our classes, we’re buying into the idea that students are untrustworthy and that surveillance, even at the cost of privacy and discrimination, is the best way to address it. By using an LMS, or TurnItIn for that matter, we’re buying into the idea that a student’s data is infinitely exploitable.


Here’s the upside. Technology can also manifest and reinforce values we think are good. If we believe that education is the practice of freedom, we can build technologies that manifest that. If we believe that education should be centered on agency, liberation, and love, we can build technologies that manifest those too. In the meantime, we should try to interrogate the implicit values in the technologies currently offered to us before we use them in the classroom. Admittedly, this is time-consuming and often confusing. God forbid we be reduced to reading the Terms of Service, but sometimes I’ve had to. The best advice I can give you is to listen to people who dedicate their expertise and energy to this effort. They will be the ones who tell you how to avoid some of the current landmines. Then find people who talk about teaching in a way that connects with your core values and inspires you. They will be the ones who help you imagine something better and help you get there.


Forget the technology. It’s almost always best to use as little as possible. Connect with people. Center care. The rest will grow from that.



©2020 by Shea Swauger