ENG 548 — Seminar in Critical Cultural Theory · Washington State University · Spring 2026 · Submitted for Faculty Research Grant Consideration
Bibhushana Poudyal
bibhushanapoudyal.com
Washington State University
Spring 2026
ENG 548 · Seminar in Critical Cultural Theory
SYSTEMS UNDER EXAMINATION: COMPAS (Northpointe)  ·  Palantir Gotham  ·  NSA SKYNET  ·  Clearview AI  ·  Amazon Rekognition  ·  ShotSpotter  ·  IDF Lavender  ·  PredPol  ·  HART (Rite Aid)  ·  ICE/CBP Biometrics  ·  ATLAS (HSI)  ·  Surveillance Camera Aware  ·  Facebook's PhotoDNA  ·  Google AutoML  ·  Microsoft Azure FRT  ·  IBM Watson Health
02 · Course Description

This course is a radical excavation of how artificial intelligence functions as a weapon of empire, capital, and carceral power. We examine the violent entanglements of AI with biopolitical control and engineered systems of domination — systems that surveil, categorize, and criminalize the global majority under the false banners of innovation, progress, and neutrality. Grounded in decolonial, Black feminist, Indigenous, queer, and abolitionist thought, this course exposes AI not as an impartial tool, but as a racialized and militarized infrastructure built to extend the legacies of settler colonialism, slavery, and eugenics.

Through frameworks of necropolitics, racial capitalism, and surveillance abolition, we will interrogate how predictive policing, biometric surveillance, algorithmic governance, and data extraction are deployed to manage, punish, and disappear marginalized communities. We will analyze how AI consolidates power within the prison-industrial and military-industrial complexes—targeting Black, Indigenous, queer, disabled, undocumented, and impoverished bodies while presenting itself as objective and efficient.

Rather than framing these harms as accidental, we treat them as the logical outcomes of technologies designed within—and for—systems of racial and imperial domination. This course is not neutral. It is a call to dismantle oppressive AI systems and to build liberatory, collective, and just technological futures.

The success of this course depends entirely on our shared commitment to collective un/re/learning, radical solidarity, and a refusal to settle for anything less than total liberation.
Technical Systems We Will Dissect — Documented Cases
COMPAS
Correctional Offender Management Profiling for Alternative Sanctions (Northpointe/equivant). Black defendants scored High Risk at 2× the rate of white defendants with similar criminal histories. Algorithm is proprietary — courts cannot inspect its weights. See: ProPublica, Machine Bias (2016).
The Correctional Offender Management Profiling for Alternative Sanctions algorithm was built by Equivant (formerly Northpointe). ProPublica analyzed 7,214 defendants in Broward County, FL. The recidivism prediction is used in pre-trial, sentencing, and parole decisions across 30+ states. The algorithm's weights are proprietary — defendants have no legal right to challenge its logic.
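ProPublica's headline comparison is a short computation you can reproduce. Below is a minimal sketch run on invented toy records rather than the real data; the column names (race, decile_score, two_year_recid) match the public compas-scores-two-years.csv in the repository, and the "high risk = decile ≥ 5" cutoff follows ProPublica's published methodology, but the counts here are illustrative only.

```python
# Sketch of ProPublica's core false-positive-rate comparison, on toy records.
# Field names mirror the public COMPAS dataset; the numbers are invented to
# illustrate the reported pattern (Black non-recidivists flagged high risk
# roughly twice as often as white non-recidivists).

def false_positive_rate(records, group):
    """Share of non-recidivists in `group` who were labeled high risk."""
    non_recid = [r for r in records
                 if r["race"] == group and r["two_year_recid"] == 0]
    flagged = [r for r in non_recid if r["decile_score"] >= 5]
    return len(flagged) / len(non_recid)

toy = (
    [{"race": "African-American", "decile_score": 7, "two_year_recid": 0}] * 45
    + [{"race": "African-American", "decile_score": 2, "two_year_recid": 0}] * 55
    + [{"race": "Caucasian", "decile_score": 7, "two_year_recid": 0}] * 23
    + [{"race": "Caucasian", "decile_score": 2, "two_year_recid": 0}] * 77
)

print(false_positive_rate(toy, "African-American"))  # 0.45
print(false_positive_rate(toy, "Caucasian"))         # 0.23
```

Running the same function over the real CSV is the exercise: the defendants' outcomes are public, the model's weights are not.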
Palantir Gotham
Data fusion platform used by ICE, FBI, LAPD, CBP, US Army. $2.4B in government contracts (2024). Fuses hundreds of databases — arrest records, social media, license plates, utility records — into unified surveillance profiles on millions of people.
Founded by Peter Thiel with In-Q-Tel (CIA venture capital) funding. Palantir's Gotham platform is deployed at 23 of the 25 largest US police departments, ICE, CBP, US Army. It fuses 100+ data sources: DMV records, utility bills, arrest records, social media, license plate readers, etc. Each person generates a "pattern of life" profile updated in real time.
NSA SKYNET
Machine learning program that flagged suspected terrorists in Pakistan and Somalia from patterns in phone metadata. Leaked documents reveal substantial false-positive rates. Estimated hundreds of civilian deaths from algorithm-selected targets. Source: The Intercept, The Drone Papers (2015).
SKYNET used metadata from 55 million Pakistani cell phone users to build behavioral models of "terrorist" communication patterns. The algorithm flagged travel routes, calling patterns, and social connections. The Intercept's analysis of leaked NSA documents reported false-positive rates that, applied to a population of 55 million, would still flag thousands of innocent civilians.
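The base-rate problem here is simple arithmetic. A sketch, using the 55-million population figure from the leaked slides and the range of false-positive rates (0.18% down to 0.008%) cited in reporting on the documents:

```python
# Why even a "tiny" false-positive rate is catastrophic at population scale:
# multiply the rate by the number of people being scored.

def expected_false_positives(fp_rate, population=55_000_000):
    """Innocent people flagged, given a false-positive rate on the population."""
    return round(population * fp_rate)

print(expected_false_positives(0.00008))  # 0.008% of 55M: 4400 people
print(expected_false_positives(0.0018))   # 0.18% of 55M: 99000 people
```

At the most optimistic rate in the slides, thousands of innocent people are flagged; at the pessimistic rate, tens of thousands.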
Clearview AI
Scraped 30+ billion faces from social media without consent. Sold to 3,100+ law enforcement agencies. Banned in EU, Canada, Australia, UK. Still operating in the US. CEO's legal argument: the First Amendment as a surveillance right.
CEO Hoan Ton-That argues that the First Amendment protects the right to scrape and analyze any publicly posted image. Clearview has been ordered to stop operating in the EU (GDPR), UK (ICO), Canada, and Australia. In the US: no comprehensive federal biometric law exists. The ACLU sued Clearview in Illinois under BIPA (Biometric Information Privacy Act), winning a settlement in 2022.
Amazon Rekognition
Facial recognition API shown to misclassify dark-skinned women at error rates above 30% in the follow-up audit to Gender Shades (Raji & Buolamwini, 2019). Robert Julian-Borchak Williams: first documented wrongful arrest from FRT misidentification, Michigan, 2020.
Joy Buolamwini's Gender Shades study (MIT, 2018) audited three commercial FRT systems: Microsoft, IBM, and Face++. A 2019 follow-up audit with Deborah Raji extended the benchmark to Amazon Rekognition, which showed near-zero error on light-skinned men but error rates above 30% on dark-skinned women. Robert Julian-Borchak Williams was handcuffed in front of his daughters after Detroit police ran a grainy surveillance photo through facial recognition software. The match was wrong.
ShotSpotter
Acoustic gunshot detection deployed overwhelmingly in Black and Latinx neighborhoods. Roughly 89% of alerts in Chicago (2019–2021), more than 40,000 in total, produced no evidence of a gun-related crime. Contracts with cities are secret — residents are not notified. Alerts have been used as evidence to obtain warrants.
Vice Media obtained ShotSpotter internal documents through FOIA. The company employs human reviewers who reclassify acoustic detections after the fact — changing "firecracker" to "gunshot." In Chicago, 89% of ShotSpotter alerts led to no evidence of crime. The company's contracts with cities are confidential — residents in surveilled areas were never informed.
IDF Lavender
Israeli AI targeting system that auto-generated a list of 37,000+ targets in Gaza, with minimal human review per strike authorization. Reports describe 15–20 civilian deaths per "junior militant" target treated as acceptable. Source: +972 Magazine, The Guardian (2024).
+972 Magazine published Israeli intelligence sources describing a system that generated a target list of 37,000 people based on association networks. A parallel system called "Where's Daddy?" tracked when targets entered their family homes — the preferred strike time. Human review per strike authorization: approximately 20 seconds.
03 · Data Visualizations

These charts are built from real, citable data — run the analysis yourself using the GitHub links below.

COMPAS Racial Bias — Broward County, FL (n = 7,214 defendants)
Source: ProPublica Machine Bias (2016) · github.com/propublica/compas-analysis · DOI data available
Big Tech Federal Contracts by Agency — Amazon, Google, Microsoft (DoD, DHS, DoJ, State + Other)
Source: Digital Destroyers — bigtechsellswar.com
NSA SKYNET / Drone Kill Chain — Algorithmic Steps from Data to Strike
Source: The Intercept — The Drone Papers (2015) · theintercept.com/drone-papers/
Facial Recognition Error Rates by Skin Tone & Gender — Gender Shades Study
Source: Buolamwini & Gebru (2018) · proceedings.mlr.press/v81/buolamwini18a.html · Microsoft, IBM, Face++ systems tested
04 · Textbook & Essential Readings

Necropolitics
Mbembe, Achille. (2019). Duke University Press. — PRIMARY TEXT

The foundational framework for this course. Mbembe's necropolitics — the sovereign power to decide who may live and who must die — is applied directly to every AI system examined. Read the original 2003 essay first: DOI 10.2307/3250559 (JSTOR open access).

*All other required materials provided via PDF or Canvas links.

Additional Essential Readings — Technically Annotated
The Age of Surveillance Capitalism — Zuboff, S. (2019). PublicAffairs. → Technical anatomy of data extraction as an economic model. Why surveillance is profitable, not just harmful.
Race After Technology: Abolitionist Tools for the New Jim Code — Benjamin, R. (2019). Polity. → The "New Jim Code." Racial hierarchy repackaged in algorithmic language. Read alongside Noble.
Weapons of Math Destruction — O'Neil, C. (2016). Crown. → Mathematical proof, written accessibly, of how "neutral" models produce targeted harm. Pairs with ProPublica COMPAS analysis.
Dark Matters: On the Surveillance of Blackness — Browne, S. (2015). Duke UP. → Surveillance as anti-Black technology from slave ships to biometrics. The historical continuity argument: this has always been the system.
Algorithms of Oppression — Noble, S.U. (2018). NYU Press. → White supremacy in Google's ranking systems. Note: every LLM trained on internet text inherits this problem. Critique the AI tools you use daily.
05 · Trigger Warning

During this course, we will address topics that compel us to discuss some of the most violent atrocities humans have inflicted upon other humans and the planet. That includes:

06 · Projects, Points, Grading Schema

Assignments · Deliverables · Points · Due
Low Stakes · Reading Response · 100 · Wednesday midnight every week
Low Stakes · Scaffolding Assignments · 100 · Wednesday midnight every week
Low Stakes · Mid-Semester Research Presentation · 100 · Each student once a semester
1st Major / Low Stakes · Project Proposal · 50 · Check course calendar
2nd Major Assignment · Topic Analysis · 100 · Check course calendar
3rd Major Assignment · Final Project & Class Presentation · 150 · Check course calendar
Total Points: 500
A · 93.4–100
A− · 90–93.3
B+ · 86.8–89.9
B · 83.4–86.7
B− · 80–83.3
C+ · 76.8–79.9
C · 73.4–76.7
C− · 70–73.3
D+ · 66.8–69.9
D · 63.4–66.7
D− · 60–63.3
F · 0–59.9

THE SYLLABUS IS SUBJECT TO CHANGE.

06b · Major Assignments — Full Details

Assignment 01

Radical Topic Proposal

50 Points · 2–3 pages, single-spaced · Due: Feb 11 · Times New Roman 12pt · MLA or APA · Submit via Canvas
Purpose

This assignment invites you to choose a site of struggle where AI, data, or technology functions as a tool of domination. Your proposal will frame a semester-long investigation into how technological systems enact violence, reinforce oppression, and sustain racial capitalism, colonialism, and carceral power.

This is not a neutral exercise. It is an act of refusal and reimagining. Your topic will reflect a commitment to abolitionist, decolonial, and anti-oppressive thought, preparing you for deeper analysis in Assignments 2 and 3.

What to Include
Site of Struggle

Select a specific system, technology, or practice where AI or data-driven tools are used to surveil, classify, control, or harm. Examples (but not limited to):

  • Predictive policing or risk-assessment algorithms
  • Biometric border surveillance or immigration databases
  • AI-driven hiring, healthcare, or social service allocation
  • Facial recognition in schools, public housing, or protests
  • Data extraction in the Global South or on Indigenous land
  • Militarized AI (drones, automated warfare, occupation tech)
Context & Harm

Briefly explain:

  • Who built it? Who benefits?
  • Who is targeted, disappeared, or made disposable?
  • What historical systems does it continue? (e.g., slavery, colonialism, eugenics)
Why It Matters

Articulate the political, ethical, and existential stakes. Connect your topic to course themes:

  • Racial capitalism
  • Necropolitics / biopolitics
  • Settler colonialism / empire
  • Abolitionist futures
Guiding Questions

Pose 2–3 critical questions to guide your inquiry. These should be analytical and politically engaged. Examples:

  • How does this technology reproduce anti-Blackness or colonial violence?
  • What would it mean to abolish this system?
  • How do communities resist, subvert, or imagine beyond it?
Initial Approach

Describe what materials or methods you might use to investigate this topic. These could include:

  • Case studies, policy documents, tech whitepapers
  • Activist media, protest art, social media campaigns
  • Scholarly texts, community reports, oral histories
  • Methods: discourse analysis, visual/rhetorical analysis, speculative design
Before Submitting — What to Look For
Criterion
What We're Looking For
Clarity & Specificity
Is the topic clearly defined and focused? Does it identify a concrete system, technology, or site of struggle? Is it actionable for in-depth rhetorical and political analysis?
Political Urgency
Does the topic confront systems of harm, domination, or violence? Does it address the stakes for marginalized communities and reflect an understanding of power, oppression, and resistance?
Theoretical Grounding
Is the topic informed by—and does it engage with—decolonial, abolitionist, Black feminist, Indigenous, queer, or other radical frameworks? Does it move beyond liberal or technical critiques?
Feasibility
Can the topic support deep, sustained investigation across 10–12 pages? Are relevant materials (texts, artifacts, media) available for analysis? Does it allow for meaningful connection to course themes and thinkers?
Voice & Commitment
Does the proposal reflect authentic critical engagement, curiosity, and a refusal of neutrality? Is there a clear sense of intellectual and political investment in the inquiry?

Note: Your proposal sets the tone for your work this semester. Choose a topic you care about deeply—one that fuels your curiosity, critique, and commitment to building liberatory futures.

Let your inquiry be radical, your analysis be sharp, and your imagination be unbound.

We are not here to fix oppressive systems. We are here to dismantle them.


Assignment 02

Rhetoric, Power, and Techno-Politics

100 Points · 10–12 pages, double-spaced (excluding citations) · Due: March 15
Why This Assignment?

This is a radical rhetorical excavation. You will analyze how power speaks through and around technology—how violence is made to sound reasonable, how control is framed as care, and how resistance breaks through dominant narratives. Your task is to unmask the political work done by texts, visuals, and designs that shape our technological world.

This analysis directly prepares you for your final project, where you will move from critique to intervention.

What You'll Do

Select 2–3 materials linked to your semester topic that reveal how rhetoric operates in the service of—or against—systems of oppression. These can be (not limited to):

  • Official/powerful texts: Tech white papers, policy briefs, corporate ethics statements, surveillance patents, legal rulings.
  • Public/persuasive texts: Political speeches, news coverage, marketing campaigns, training materials.
  • Resistance texts: Protest art, zines, social media threads, abolitionist toolkits, music, counter-data visualizations.

All materials should be treated as non-neutral artifacts carrying ideological force.

Required Sections
1. Introduction (1–2 pages)
  • Re-state your topic and its stakes within systems of racial capitalism, colonialism, or carceral power.
  • Present your 2–3 materials with brief context for each:
    • What it is, where it circulated, who produced it.
    • Intended audience and purpose.
    • Why you chose it: what rhetorical power does it hold or challenge?
2. Critical Rhetorical Analysis (5–7 pages)

This is the core of your paper. Examine how each text works rhetorically. Consider:

  • Framing & Metaphor: How are AI, data, or surveillance described? (e.g., "smart," "neutral," "guardian," "threat")
  • Absence & Erasure: Whose voices, bodies, or harms are missing?
  • Emotional Appeals: What feelings are mobilized? (fear, urgency, trust, pride)
  • Visual & Design Rhetoric: How do layout, color, imagery, or interface design persuade?
  • Narrative & Logic: What stories are told about progress, safety, or inevitability?
  • Use direct evidence: quotes, images, descriptions.
  • Cite course thinkers to deepen your analysis.
3. Discussion: Radical Implications (2–3 pages)

Step back. Synthesize what your analysis reveals about:

  • How these texts reinforce or resist racial capitalism, settler colonialism, or carceral logics.
  • How liberal language ("fairness," "transparency," "innovation") can be weaponized.
  • Where you see ruptures: moments of refusal, subversion, or abolitionist imagination.
4. Conclusion: Toward Intervention (1–2 pages)
  • Reflect on what you've learned about rhetoric as a political force.
  • Pose lingering questions.
  • Explain how this analysis informs your final project vision—what will you build, resist, or reimagine?
5. Works Cited

MLA or APA style. Not included in page count.

Before Submitting — What to Look For
Category
What We're Looking For
Clarity & Focus
Clear topic, well-chosen materials, sharp thesis.
Analytical Depth
Goes beyond description to expose ideology, power, and harm.
Use of Evidence
Quotes, images, and examples are woven critically into analysis.
Course Engagement
Uses radical frameworks (abolitionist, decolonial, anti-racist).
Political Insight
Shows how rhetoric shapes material realities and possibilities.
Writing & Structure
Polished, organized, and compelling.

Remember: You are analyzing texts of power—some wield it, some resist it. Your job is to expose how language, image, and design are never innocent. They build worlds. Your critique is a step toward dismantling oppressive ones and imagining those that are liberatory.

Analyze with rigor. Write with purpose. Intervene with courage.


Assignment 03

Final Project — From Critique to Liberation Imaginaries

150 Points Total · Final Project: 100 pts · Class Sharing: 50 pts · Due: May 7
Purpose

This is your culminating intervention—the moment where analysis transforms into action. Building on your semester-long investigation of AI, power, and oppression, you will now propose, create, or enact an abolitionist alternative.

Your project must move beyond critique toward liberatory imagination, grounded in anti-racist, decolonial, and abolitionist frameworks.

Two Deliverables
Part I · Final Project · 100 points
Part II · In-class presentation of the project · 50 points
Final Project (100 Points) — Choose One or Mixed Format
Option A
Academic Paper — Abolitionist Blueprint

Length: 10–12 pages, double-spaced

Focus: Analyze existing abolitionist tech movements or community resistance. Propose concrete steps to dismantle oppressive systems and build liberatory alternatives.

Sections:

  • Introduction: Your topic + its stakes in abolitionist struggle
  • Analysis of Resistance: How are communities refusing or reimagining tech?
  • Abolitionist Proposals: 3–5 actionable steps for dismantling and rebuilding
  • Conclusion: Vision for a post-carceral technological future
Option B
Action Research — Groundwork for Liberation

Engage directly with a community/organization resisting carceral tech.

Submit:

  • Fieldwork Journal (ongoing reflections)
  • Final Report (8–12 pages) detailing:
    • The organization's abolitionist work
    • Your role and contribution
    • Analysis of their strategies
    • Co-created recommendations
Option C
Multimodal Project — Liberatory Worldmaking

Create a public-facing work that educates, mobilizes, or prefigures abolitionist futures.

Formats include: documentary, zine, toolkit, podcast, game design, PSA, workshop, digital archive, speculative art, counter-surveillance tool, abolitionist app prototype, etc.

Deliverables:

  • The creative work itself
  • Creator's Statement (1–2 pages): explains your political and aesthetic choices
Before Submitting — What to Look For
Criterion
What We're Looking For
Abolitionist Alignment
Does the project reject reformist solutions and center dismantling + rebuilding?
Political Clarity
Clear stakes, audience, and theory of change. Moves beyond awareness to action.
Rigor & Creativity
Depth of research/engagement + innovative form/content. Thoughtful design choices.
Impact & Feasibility
Recommendations are actionable; creative work is engaging and resonant. Shows understanding of real-world context.
Presentation & Polish
Compelling, well-produced, professionally presented.

This is not the end of our learning. It is the beginning of our praxis. Build with intention. Imagine without limits. Interrupt what must be destroyed. Create what must live. Intervene for liberation.

07 · Four Core Questions We Will Wrestle With


Question 01

How does artificial intelligence entrench existing systems of power—racial capitalism, cisheteropatriarchy, ableism, and settler colonialism—under the guise of neutrality and innovation? What forms of systemic violence does it automate, legitimize, or obscure?

Question 02

In what ways are algorithmic systems—predictive policing, biometric surveillance, risk scoring—actively complicit in expanding the prison-industrial and military-industrial complexes? How do they refine the logics of captivity, war, and occupation?

Question 03

How does the extraction and commodification of data—through digital profiling, surveillance capitalism, and data colonialism—undermine autonomy, erode collective power, and reproduce colonial domination? What is our role, as scholars, in either sustaining or refusing this system?

Question 04

What radical pathways exist for resisting and dismantling AI infrastructures of domination? How might we imagine and co-create technologies in service of collective liberation, rather than control? What does abolitionist, decolonial, and life-affirming AI look like—if it is possible at all?

08 · Essential Viewing — Embedded Documentaries & Lectures

Every student must watch all of these before Week 4. Each runs under 30 minutes except where noted. Approach them as primary sources — treat what the experts say about their own systems as evidence.

Netflix · MIT Media Lab · Joy Buolamwini · 4 min
Joy Buolamwini discovers Amazon Rekognition cannot detect her dark-skinned face. MIT research, congressional testimony, global policy consequences. Full documentary on Netflix.
Vox · 14 min · 2022
Concrete technical breakdown of how Palantir's Gotham platform fuses government databases to build surveillance profiles on millions of people.
MIT Media Lab · 2018 · Landmark Research
The paper that exposed racial and gender bias in commercial facial recognition systems. IBM error rate: 0.3% on lighter men, 34.7% on darker women.
Harvard Institute of Politics · 30 min
Benjamin argues that discriminatory design is not a bug but a feature — "the New Jim Code" encodes racial hierarchy into algorithmic systems presented as neutral.
New York Times Visual Investigations · 2020
NYT investigation into Clearview's database of 30+ billion scraped faces. First major mainstream exposé of the company's reach into law enforcement.
Science Friday · 24 min · Ruha Benjamin + Deborah Raji
Ruha Benjamin and Deborah Raji discuss how FRT failures are not technical accidents — they are predictable outcomes of building systems on biased data.
Science Friday · 12 min
How racial bias operates not just in AI outputs but in who builds the systems, who funds the research, and who is imagined as the end user.
Netflix · 2020 · Engineers Speak Out
Former engineers at Facebook, Google, Twitter speak on the deliberate design of addictive systems. Context for understanding platform complicity in surveillance capitalism.

Class Policies
Attendance.
Attendance: Students are expected to make all reasonable efforts to attend every class meeting. If you must be absent, inform the instructor as soon as possible. But hey! This class is not the only life you are living. It is just a tiny fraction of your life. So, let's talk.
Late Work.
Late Work: Assignments are due by the posted deadlines, which are clearly listed here and in Canvas.

Extension: Late assignments accepted without penalty if you email at least 2 days before the deadline and an alternative deadline is mutually agreed upon.

But again, this class is not the only life you are living. It is just a tiny fraction of your life. So, let's talk and figure out the alternatives.
Inclusion & Accessibility
Every Body-Mind-Heart Learns Differently.
No two people learn exactly the same way. If you find that the materials are difficult for you to absorb, don't assume right away that you don't understand the material. Perhaps you prefer to process information through speaking or listening, but all I am providing are written handouts, making it difficult for you to process. Please come speak with me if you would like to think through other options for engaging with the material and activities in the course.

Disabilities are visible and invisible, documented and undocumented: I do not distinguish between these designations. If you have a disability, or think you may have a disability, I encourage you to speak with me as soon as you can about your learning needs and how I can best accommodate them.

If there are aspects of the design, instruction, and/or experiences within this course that result in barriers to your inclusion or accurate assessment of achievement, please notify me as soon as possible and/or contact Student Accessibility Services. You may contact Student Accessibility Services without notifying me if you wish; you may also speak with me without contacting them at all. I do not require documentation for accessibility in my classroom.
Life First
This Is Just a Tiny Fraction of Your Life.
Dear y'all, we are here together on what I hope will be a transformative journey of exploration and reflection, where we will engage in the radical work of anti-racism—a collective act of love, care, solidarity, critical thinking, and intellectual labor. Together, we will interrogate systems, cross intellectual and cultural borders, and examine anti-racism as a framework rooted in justice across intersecting identities and global contexts. Guided by and learning from scholars, artists, activists, and each other, we will envision transformations within ourselves and our communities (and, if possible, beyond). Your voices and perspectives will shape our space, teaching me as much as I hope to share with you. I am here to be with you every step of the way, so please don't hesitate to reach out with questions, concerns, or if you simply need to de-stress. Don't let this work lead you to dehumanize yourself. Let's embark on this meaningful inquiry together while keeping in mind that your emotional and mental wellbeing matters a lot.

And yes, this is also one of the class policies.
AI_USE_POLICY.txt — opened
AI USE POLICY_
ACCEPTABLE
✓ Permitted Uses
LEARNING: Personalized platforms, AI tutoring, study organization.
RESEARCH: Brainstorming, ideation — critically assess and cite all AI output.
COLLAB: Facilitating group tasks — never replacing human contribution.
UNACCEPTABLE
✗ Prohibited Uses
PLAGIARISM: AI-generated assignments presented as original work. Prohibited.
EXAM_MISUSE: Unauthorized AI in assessments = academic misconduct.
PRIVACY: Collecting others' data, generating impersonation content.
DISCLOSURE REQUIRED
◎ Transparency Protocol
All AI use must be disclosed in submissions. (e.g. "Generated using ChatGPT, edited for accuracy.")
Paraphrased or quoted AI content must be cited per APA or MLA Style Guide.
Do not input sensitive personal data into AI systems. No deepfakes or misleading content.
This AI use policy was itself generated using ChatGPT and edited for accuracy.
11 · Live Platforms — Explore These Directly

These are the actual companies and platforms. Open them. Read their marketing copy. Then compare it to the investigative reporting and academic research. That gap is the course.

Palantir Gotham
palantir.com/platforms/gotham
$2.4B government contracts (2024) · ICE, FBI, LAPD, US Army · Founded with CIA In-Q-Tel funding
Their law enforcement product. Read the product docs. Note the language: "data integration," "operational efficiency." Then read The Intercept's reporting on the same platform.
Palantir AIP (AI Platform)
palantir.com/platforms/aip
$480M US Army contract (2024) · LLMs for military decision-making · Real-time battlefield AI
Their newest product — large language models deployed for defense. This is the present frontier of AI and warfare.
Clearview AI
clearview.ai
30B+ scraped faces · 3,100+ agencies · Banned: EU, Canada, Australia, UK
First Amendment as surveillance right. They scraped your face from every public photo on the internet. Still operating in the US. Read their FAQ — then read the ACLU complaint.
SoundThinking (ShotSpotter)
soundthinking.com/policing
~89% of alerts produced no evidence of a gun-related crime · 40,000+ dead-end deployments, Chicago (2019–21) · Secret contracts
Acoustic surveillance. Explore their dashboard. Note who is described as a "customer." Read the Chicago OIG report on false alerts alongside the company website.
Atlas of Surveillance (EFF)
atlasofsurveillance.org
Free · Built by journalists + students · Every US city mapped · Drones, FRT, ShotSpotter, ALPRs
Search your city. Find which surveillance technologies are deployed in your neighborhood. Built using public records and crowdsourcing by the EFF and University of Nevada journalism school.
Algorithmic Justice League
ajl.org
Joy Buolamwini · Gender Shades dataset · Congressional testimony · FRT bias audits
Buolamwini's organization. Research, policy, art. The Gender Shades dataset is downloadable. Read the technical papers and the policy briefs.
GitHub Repositories — Read the Actual Code
github.com/propublica/compas-analysis
ProPublica's full R analysis of COMPAS racial bias. Entire dataset, Jupyter notebooks, statistical methodology. The analysis that sparked the algorithmic sentencing debate. Run it yourself.
★ 3.2k
github.com/Trusted-AI/AIF360
IBM's AI Fairness 360 — the bias detection and mitigation toolkit corporations claim to use. Examine what it measures, what it can fix, and what it structurally cannot.
★ 2.4k
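To see what a fairness toolkit like this actually measures, here is the disparate-impact statistic in plain Python. This is a sketch of the metric itself (the "four-fifths rule" ratio), not AIF360's API; the toy hiring outcomes are invented for illustration.

```python
# Disparate impact: the ratio of favorable-outcome rates between an
# unprivileged and a privileged group. Values below 0.8 fail the common
# four-fifths-rule threshold used in US employment discrimination law.

def disparate_impact(outcomes_unpriv, outcomes_priv):
    """Ratio of positive-outcome rates (1 = favorable, 0 = unfavorable)."""
    rate_unpriv = sum(outcomes_unpriv) / len(outcomes_unpriv)
    rate_priv = sum(outcomes_priv) / len(outcomes_priv)
    return rate_unpriv / rate_priv

# Toy hiring data: 30% of the unprivileged group approved vs 60% of the
# privileged group.
di = disparate_impact([1] * 3 + [0] * 7, [1] * 6 + [0] * 4)
print(di)  # 0.5 -- fails the 0.8 threshold
```

Note what the metric cannot see: who defined "favorable," who chose the groups, and whether the system should exist at all. That structural blind spot is the point of the exercise above.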
github.com/MadhumitaSushil/llm-bias-survey
Survey of LLM bias across race, gender, religion. Academic benchmark suite with full code. Relevant to critiquing every AI tool you use daily.
★ 890
github.com/google/model-card-toolkit
Google's model card documentation system. What "ethical AI" reporting looks like in practice. Critical exercise: evaluate what these cards do and do not disclose.
★ 420
github.com/microsoft/responsible-ai-toolbox
Microsoft's Responsible AI toolkit. Examine the technical definition of "responsible" a $3T company applies to its own AI systems. Note the limits.
★ 1.8k
github.com/cleverhans-lab/cleverhans
Adversarial attacks on ML models. Understanding how surveillance systems can be fooled is itself a form of resistance. Built by Ian Goodfellow and Nicolas Papernot.
★ 6.1k
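The core trick behind adversarial attacks of the kind CleverHans implements fits in a few lines. Below is a dependency-free sketch of the fast gradient sign method (FGSM) on a toy linear scorer; the weights and inputs are invented, and real attacks backpropagate through a deep network to obtain the input gradient rather than reading it off directly.

```python
# FGSM sketch. For a linear scorer score(x) = w . x, the gradient of the
# score with respect to x is just w, so the worst-case perturbation inside
# an L-infinity ball of radius eps is -eps * sign(w): every input feature
# moves a small, imperceptible step in the direction that hurts the score
# most.

def sign(v):
    return [(z > 0) - (z < 0) for z in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def fgsm_linear(x, w, eps):
    """Perturb x to lower w . x, changing no coordinate by more than eps."""
    return [xi - eps * si for xi, si in zip(x, sign(w))]

w = [2.0, -1.0, 0.5]    # toy "face match" scorer: match if score > 0
x = [0.3, -0.2, 0.4]    # original input: score ~1.0, a match
x_adv = fgsm_linear(x, w, 0.4)

print(dot(w, x))      # ~1.0  (match)
print(dot(w, x_adv))  # ~-0.4 (no longer a match)
```

The same input, indistinguishable to a human, flips the classifier's decision — which is why adversarial robustness research doubles as a study of how surveillance systems can be evaded.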
Key Datasets & Databases
ProPublica COMPAS Dataset
propublica.org/datastore/dataset/compas-recidivism-risk-score-data-and-analysis
Full defendant data + COMPAS scores for Broward County. The primary dataset behind Machine Bias. Download, analyze, form your own conclusions.
AI Incident Database
incidentdatabase.ai
Every documented AI harm and system failure globally. Searchable by harm type, technology, geography. The definitive catalogue of algorithmic violence.
Gender Shades Dataset
gendershades.org
Buolamwini & Gebru benchmark faces with full FRT accuracy results by skin tone and gender. The dataset that proved discrimination quantitatively.
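The study's method is straightforward to sketch: disaggregate accuracy by intersectional subgroup instead of reporting one overall number. A toy illustration with invented predictions (not the actual benchmark):

```python
# Sketch of an intersectional accuracy audit in the spirit of Gender Shades
# (not their code or data). Each record: (skin_tone, gender, correct?).
preds = [
    ("darker",  "female", False), ("darker",  "female", True),
    ("darker",  "male",   True),  ("darker",  "male",   True),
    ("lighter", "female", True),  ("lighter", "female", True),
    ("lighter", "male",   True),  ("lighter", "male",   True),
]

def accuracy(rows):
    """Fraction of rows the classifier got right."""
    return sum(r[2] for r in rows) / len(rows)

for tone, gender in sorted({(t, g) for t, g, _ in preds}):
    rows = [r for r in preds if r[:2] == (tone, gender)]
    print(tone, gender, accuracy(rows))
```

An aggregate accuracy here looks high; only the subgroup breakdown exposes where the errors concentrate, which is precisely the move Buolamwini and Gebru made against commercial FRT vendors.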
Vera Institute: Incarceration Trends
vera.org/projects/incarceration-trends
Incarceration by race, county, year. Interactive. The statistical foundation for understanding carceral AI's real-world scale.
Big Tech Sells War Database
bigtechsellswar.com
Federal contracts between Amazon, Google, Microsoft and DoD, DHS, DoJ. Amazon: 38% DoD. Google: 61% DoD. Microsoft: 40% DoD.
Surveillance Technology Oversight Project
stopspying.org
NYC surveillance tech documented by civil liberties lawyers. FOIA requests, contracts, legal challenges.

INDICATIVE COURSE MATERIAL

Is artificial intelligence racist? Adib-Moghaddam, A. (2023). Pluto Press.
Race After Technology. Benjamin, R. (2019). Polity. → The New Jim Code
More than a glitch. Broussard, M. (2023). MIT Press. → How bias enters ML pipelines technically
Dark matters: On the surveillance of Blackness. Browne, S. (2015). Duke UP. → Surveillance as anti-Black technology from slave ships to biometrics
Eloquent rage. Cooper, B. (2018). St. Martin's.
Biopolitics of security. Dillon, M. (2015). Routledge.
Discipline and Punish. Foucault, M. (1975). Vintage. → Historical root of algorithmic carceral logic
The History of Sexuality, Vol. 1. Foucault, M. (1976). Pantheon.
Security, Territory, Population (Lectures 1977–78). Foucault, M. (2004). Palgrave.
The Birth of Biopolitics (Lectures 1978–79). Foucault, M. (2004). Palgrave. → Theoretical foundation of biopolitical AI critique
Security aesthetics of and beyond the biopolitical. Ghertner et al. (Eds.). (2023). Duke UP.
No mercy here. Haley, S. (2016). UNC Press.
Scenes of subjection. Hartman, S. (1997). Oxford UP.
When they call you a terrorist. Khan-Cullors & Bandele. (2018). St. Martin's.
Racism and racial surveillance. Khan et al. (Eds.). (2023). Palgrave.
Hidden in white sight. Lawrence, C. (2023). LID Publishing.
Necropolitics. Mbembe, A. (2019). Duke UP. — PRIMARY TEXT → Original 2003 essay: DOI 10.2307/3250559 (open access)
Algorithms of Oppression. Noble, S.U. (2018). NYU Press.
So you want to talk about race. Oluo, I. (2018). Seal Press.
Weapons of Math Destruction. O'Neil, C. (2016). Crown. → How 'neutral' models inflict targeted harm at scale
Artificial Unintelligence: How computers misunderstand the world. Broussard, M. (2018). MIT Press.
Terrorist assemblages. Puar, J.K. (2007). Duke UP.
Weapons of the Weak. Scott, J.C. (1985). Yale UP.
Against ecological sovereignty. Smith, M. (2018). U Minnesota.
Captive genders. Stanley & Smith (Eds.). (2015). AK Press.
The Age of Surveillance Capitalism. Zuboff, S. (2019). PublicAffairs. → THE technical anatomy of data extraction as economic model

Popular / News Articles & Scholarly Sources

Machine Bias: There's software used across the country to predict future criminals. And it's biased against Blacks.
propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
Dealing with Bias in Artificial Intelligence
nytimes.com/2019/11/19/technology/artificial-intelligence-bias.html
Art Project Shows Racial Biases in Artificial Intelligence System
smithsonianmag.com/smart-news/art-project-reveals-racial-biases-ai-180972586
Artificial intelligence is creating a new colonial world order
technologyreview.com/2022/04/19/1049592/artificial-intelligence-colonialism/
How the AI industry profits from catastrophe
technologyreview.com/2022/04/20/1050392/ai-industry-appen-scale-data-labels/
The NYT Op-ed Page Just Published an AI Weapons Infomercial
inkstickmedia.com/the-nyt-op-ed-page-just-published-an-ai-weapons-infomercial/
Depicting Palestine: What it means to witness a genocide
Scholarly / Academic Articles
Dissecting racial bias in an algorithm used to manage the health of populations
science.org/doi/10.1126/science.aax2342
Race and Rights in the Digital Age
cambridge.org/core/journals/ajil-unbound

Weekly Assigned Materials

Two Weekly Tasks Due Every Wednesday by Midnight (Unless Noted Otherwise): Scaffolding Assignment & Reading Response  ·  Three Major Assignments: Check Canvas for Due Dates

⚠️

Note: Some links in the assigned materials may take you to the WSU library. Only WSU credentials will allow access to those materials.

Week 1

For our first day's conversation, we will take a preliminary look at this crucial work.

Week 2 Foundations — Biopolitics, Necropolitics, and AI
Week 3 Colonial Legacies and Racial Science in Tech
Week 4 Surveillance of Blackness and Social Control
Week 5 From "Minority Report" to Reality — Predictive Policing & Algorithmic Injustice
Week 6 The Carceral Continuum — From Prison Tech to Everyday Life
Week 7 Health, Medicine, and Biopower — Carceral Science and the Body
Week 8 Data Capitalism, Digital Profiling, and Inequality
Week 9 Intersectionality — Gender, Sexuality, and the Carceral Tech Gaze
Spring Break

Nothing is due this week.

Week 10 Militarization, Borders, and Technoscientific Violence
Week 11 Abolitionist Tech Movements and Digital Activism
Week 12 Abolitionist Tech Movements and Digital Activism (continued)
Week 13 Speculative Futures — Afrofuturism, Indigeneity, and Imagination
Week 14 Conclusion — Ethics, Care, and Future Directions
Weeks 15 & 16
01001110 01000101 01000011 01010010 01001111 00100000 01000001 01001001 00100000 01001011 01001001 01001100 01001100 01010011
NECRO AI KILLS — what systems do when we do not look.