Do Not Design: A Starter List of Ethically Dangerous Employers
#015: When elegant systems mask violent outcomes, it’s time to step away from the keyboard.
As designers, we like to talk about impact and about elevating experiences by solving user problems. Rarely, though, do we talk about the other kind of impact: the kind that leads to deportation raids, drone strikes, predictive arrests, or the silent expansion of a surveillance state built on the backs of our carefully orchestrated user interfaces.
If you're a designer working in tech today, you’re likely closer to that harm than you think.
Sometimes we know it: the UI for a drone flight control system or a command center dashboard for ICE doesn’t exactly scream “neutral product.” But often it’s more subtle. The design is elegant, the KPIs are clear, and the team culture is good. Behind the scenes, though, your design is enabling something you might not be okay with if you stopped to really look.
It’s time we start naming names and drawing lines.
Design Is Never Neutral
Design is power. It doesn’t just shape what people can do; it shapes what they will do. It removes friction, guides choices, and intentionally builds systems that scale. It can also cloak those systems in legibility and legitimacy. Good design makes harmful systems more usable, more palatable, and more efficient.
That’s why so many ethically questionable tech companies desperately need good designers. Because without us, their systems are hard to use and harder to sell.
Companies to Avoid
There’s a growing list of companies whose business models or contracts with governments raise serious ethical red flags. Here are just a few where designers should think twice (or not at all) before accepting an offer. Some are fairly obvious; others, not so much:
Palantir
Palantir builds powerful surveillance and data analysis tools used by governments and law enforcement agencies around the world. Their products have been central to ICE deportation efforts in the U.S., enabling mass data capture and targeting of undocumented immigrants. In other words: Palantir’s tools help track down people for deportation — and make it look good while doing it.
As The Intercept reported, Palantir’s backend systems were specifically designed to ingest data from license plate readers, arrest records, and social media, allowing ICE to conduct sweeping raids with terrifying efficiency.
If you design those interfaces, you’re not “just improving data visibility.” You’re improving the speed and accuracy of deportations.
Anduril Industries
Founded by Oculus co-founder Palmer Luckey, Anduril is building autonomous warfare tech, including drones, surveillance towers, and battlefield AI systems. Their stated goal is to “transform U.S. and allied military capabilities with advanced technology.”
Translation: They’re building tools for war. Not hypothetically or metaphorically. Literally.
Anduril is also a leading purveyor of surveillance technologies with deeply troubling outcomes. Its ‘virtual border wall’ towers may not just deter border crossings; they actively push people into lethal terrain. A Coda Story investigation revealed that this surveillance infrastructure funnels migrants into deserts and mountain passes, increasing the risk of dehydration, heatstroke, and death.
Similarly, The Guardian has documented how towers near the Otay Mountain Wilderness channel crossings into hostile environments, and additional reporting underscores that AI-enabled detection systems drive migrants deeper into unlivable desert. Studies compiled by Migration Policy and the No Tech for ICE report offer data showing that the “funnel effect” correlates with rising migrant mortality in surveillance-heavy zones.
If you sign up to improve Anduril’s user onboarding or craft its design system, you're enabling autonomous surveillance towers used at the U.S.-Mexico border and potentially on future battlefields. That’s not “disruptive innovation.” That’s militarized product design.
Meta
Often viewed as a consumer tech giant whose rebrand centers on social VR and the “metaverse,” Meta also plays a quiet but growing role in real-world surveillance and military technology.
Recently, Meta partnered with Anduril Industries (yes, that one) to advance AR systems for military use, pairing Meta’s hardware with Anduril’s battlefield software to create next-generation situational awareness tools. A consumer social company and a war-tech startup now share a vision and a pipeline. You can even read Anduril’s effusive announcement of the partnership here.
The implications go far beyond “mixed reality.” This is a pipeline from lifestyle gadget to battlefield interface, and designers are in the middle of it.
More recently, a CBP agent was photographed wearing Meta’s smart glasses during an immigration raid in Los Angeles. The glasses — co-branded with Ray-Ban — include a camera and microphone, enabling always-on recording with minimal visibility. Meta claims the glasses are not intended for law enforcement use and denies involvement in this specific deployment. But the reality is simple: the tools exist, and they’re being used.
Designers at Meta may believe they’re building for creators, gamers, or the next generation of social interaction, but these systems don’t live in a vacuum. They’re entering homes, public spaces, and now ICE raids and defense contracts.
Clearview AI
Clearview scrapes billions of photos from social media and other public websites to train facial recognition algorithms — often without consent. Their software is marketed to police departments and other security agencies for real-time identification.
This isn’t a fringe startup; it’s a company whose product was deemed dangerous by the Electronic Frontier Foundation and has been banned or restricted in multiple countries. The risk of false positives, racial bias, and misuse is extremely high, and the implications for civil liberties are profound.
If you help improve Clearview’s UI, you're not “making face search faster.” You’re helping police identify protestors and surveil communities without warrants or oversight.
Axon (formerly Taser)
Axon sells body cams, tasers, and police software. While they’ve attempted to position themselves as a reform-minded tech company, their push toward facial recognition and real-time surveillance caused their own ethics board to resign in protest.
The resigning members cited serious concerns about racial bias and the potential for civil rights violations. If the company’s own ethics board walks out, that should tell you something.
Designing for Axon isn't civic tech — it's compliance theater.
Amazon (Ring + Rekognition)
Amazon’s consumer surveillance empire extends beyond shopping. Ring doorbells — marketed as neighborhood safety tools — have fueled over-policing and racial profiling. Rekognition, Amazon’s facial recognition software, was sold to law enforcement with little oversight before being “paused” after public backlash.
Even if your job is technically on the “Alexa” team or in “retail innovation,” the gravity of Amazon’s surveillance capabilities and its willingness to sell them should give any designer pause.
How to Spot the Red Flags
It’s not always obvious. Job listings won’t say “help us design an interface to make police profiling more efficient.” So here are a few questions to ask when evaluating a company:
What is this product ultimately used for? And by whom?
Could this system be used to harm, profile, or surveil someone unfairly?
Are there public concerns or reporting about the company’s practices?
Would I feel comfortable telling someone impacted by this product what I do?
Is the company's biggest “customer” the government or military?
Even better: Google “[company name] + EFF” or “[company name] + surveillance.” You’ll learn more in five minutes than from a recruiter’s sales pitch.
What About “Change from the Inside”?
This is the classic pushback: If ethical designers don’t work there, won’t it just be worse?
Maybe. But that’s not the point.
The question isn’t whether your absence would make the company worse. The question is whether your presence makes you complicit. At a certain point, staying inside becomes less about change — and more about comfort, career, or convincing yourself the work isn’t as harmful as it clearly is.
Change can happen from within. But not if the system is built to do harm, and especially not if your role is to optimize it.
Sometimes, the most powerful act of design is refusal.
What We Can Design Instead
It’s easy to feel disillusioned after seeing how design can be weaponized. But the flip side is just as powerful: design can also be used to liberate, protect, and repair.
There are companies quietly doing this work — often without the attention (or funding) of their more harmful counterparts. They might not be recruiting at fancy conferences or offering Silicon Valley-level compensation packages, but they’re building systems worth believing in.
A few worth noting:
Uptrust – A public-interest tech company that helps reduce unnecessary jail time by reminding people of court dates and connecting them to social services — instead of punishing them for poverty.
SimpliGov – A platform used by government agencies to streamline services for people, not against them — like applying for food assistance or pandemic relief.
Ushahidi – A nonprofit tech company born out of the Kenyan election crisis, building open-source tools to map violence, corruption, and disaster response efforts around the globe.
These aren’t the only ones, but they are a reminder: There’s no shortage of problems worth solving. Only a shortage of attention given to the people solving them ethically.
Next Thursday I’ll share a deeper list of purpose-driven companies, co-ops, and nonprofits where designers can put their skills to work without compromising their values. If this week was about red flags, next week is about green lights.
Because design doesn't just shape products. It shapes futures.