Let’s cut to the chase: if your product requires users to dig through labyrinthine settings just to stop you from hoarding their personal data, you’re not designing—you’re manipulating. And it’s time we call this out for what it is: a failure of ethics disguised as UX.

For years, the design community has held up “dark patterns” as cautionary tales. We shake our heads at sneaky pre-ticked boxes, endless cookie banners, and “are you sure you want to unsubscribe?” popups.

But let’s be honest—those weren’t cautionary tales, they were industry norms. They became the default way to design the web. Now, with AI everywhere—listening, recording, predicting, nudging—defaults aren’t just about convenience anymore. They’re about power.

Defaults Are Not Neutral

Here’s the uncomfortable truth: defaults are decisions. They are ethical stances dressed up as “user experience.” When you decide that the default is “collect everything unless the user opts out,” you’re not being neutral—you’re being extractive.

Think about it: almost nobody changes defaults. Studies show that 90% of users stick with whatever the system hands them. So if you design your product with a privacy-hostile default, you’ve effectively made the ethical choice for them. Except it’s not ethical—it’s exploitation through inertia.

The AI Factor: Amplified Exploitation

Before AI, data collection was creepy. With AI, it’s radioactive. Training algorithms on user behavior, private messages, and clicks isn’t just about “improving personalization.” It shapes what the AI knows, how it behaves, and ultimately, what the user sees as truth. If the default is “collect everything,” you’re not just designing a product—you’re curating reality.

And here’s the kicker: most users have no idea what data is being collected, how it’s being used, or how long it sticks around. A buried toggle in “Advanced Settings > Privacy > Data Sharing > Miscellaneous” isn’t user control—it’s plausible deniability.

Ethical Defaults as a Design Principle

If design is about responsibility, then the principle is simple: users shouldn’t have to fight for their own privacy. The default should be safety, transparency, and minimal data collection. Full stop.

That means (see the sketch after this list):

  • Location services are off until explicitly requested.
  • Microphones and cameras are off unless actively used.
  • Data retention is minimal unless users opt in for more.
  • Explanations of how AI systems use data are clear, not buried in legalese.
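To make that concrete, here is what those defaults could look like as a settings schema. This is a minimal sketch under assumed names (UserPrivacySettings and DEFAULT_PRIVACY_SETTINGS are hypothetical, not any platform’s real API); the point is that the safe value is the starting value, and anything more permissive is an explicit user action.

```typescript
// A minimal sketch of privacy-first defaults. All names are illustrative,
// not a real product API.
interface UserPrivacySettings {
  locationAccess: "off" | "whileUsing" | "always";
  microphoneAccess: "off" | "whileUsing";
  cameraAccess: "off" | "whileUsing";
  dataRetentionDays: number;          // how long behavioral data is kept
  personalization: boolean;           // use of the user's data for personalization
  plainLanguageDataNotice: boolean;   // show a readable explanation, not legalese
}

// The ethical stance lives in the default object, not in the settings screen:
// everything sensitive starts off, and the user opts *in* to more.
const DEFAULT_PRIVACY_SETTINGS: UserPrivacySettings = {
  locationAccess: "off",
  microphoneAccess: "off",
  cameraAccess: "off",
  dataRetentionDays: 30,
  personalization: false,
  plainLanguageDataNotice: true,
};

// New accounts inherit the safe baseline; any loosening is an explicit user action.
function createAccountSettings(
  overrides: Partial<UserPrivacySettings> = {}
): UserPrivacySettings {
  return { ...DEFAULT_PRIVACY_SETTINGS, ...overrides };
}
```

Notice that the ethics live in one constant. Moving a product from extractive to respectful can be, quite literally, a one-object diff.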

Yes, this might reduce the short-term metrics your CEO obsesses over. Yes, it might mean fewer ads “personalized” to the fact that you searched for hemorrhoid cream once. But it also builds the one metric that actually matters long term: trust.

The Backlash Designers Don’t Want to Hear

Here’s where the controversy kicks in: many designers reading this will quietly think, “But if we don’t collect data, how will we compete? Everyone else is doing it.”

To which the answer is: that’s exactly the point. Everyone else is busy strip-mining user trust for quarterly growth. You don’t win the future by copying the worst of Silicon Valley. You win it by building something people don’t feel slimy using.

Take Apple. Love them or hate them, their privacy-as-default stance, most visibly App Tracking Transparency, which made cross-app tracking an explicit opt-in, has carved out a position so strong that entire ad networks have had to retool. They didn’t just flip a toggle; they turned ethics into a market advantage.

Transparency Theater Isn’t Enough

Now, some companies try to sidestep the issue with what I call Transparency Theater. They dump giant PDFs of “Your Privacy Choices” on the user. They create dashboards so confusing you need a law degree to parse them. They offer “controls” but quietly nudge users back into sharing more data through dark patterns.

This isn’t ethics. It’s theater. And users are catching on.

Why Designers, Not Just Lawyers, Own This

Here’s the uncomfortable part for us: defaults are a design problem, not just a policy one. Every time you make a decision about what’s checked, what’s hidden, what’s on by default, you’re taking a stance. Pretending it’s “just business” is an abdication of responsibility.

As designers, we like to see ourselves as champions of the user. Well, championing the user in the AI era means standing up to the business models that want to hoard, predict, and manipulate.

If we’re complicit in burying controls, hiding opt-outs, or nudging people into sharing more than they intended, then we’re not just “shipping features.” We’re engineering consent.

The Future of Defaults

So what would it look like if we got this right? Imagine a future where (the first of these is sketched in code after the list):

  • AI assistants default to forgetting what you say unless you explicitly ask them to remember.
  • Social platforms default to private accounts, with sharing as an active choice, not the other way around.
  • Recommendation systems default to transparency, showing why they’re surfacing certain content.
  • Data collection defaults to “off,” and personalization defaults to optional.
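To show how small a product decision the first of those is, here is a sketch of a forget-by-default memory policy for an assistant. The names (AssistantMemory, rememberLastTurn) are hypothetical; the point is that persistence requires an explicit user request, and ending the session quietly discards everything else.

```typescript
// A sketch of "forget by default" for an AI assistant: nothing persists past
// the session unless the user explicitly asks to remember it. Names are hypothetical.
class AssistantMemory {
  private sessionTurns: string[] = [];   // ephemeral, cleared when the session ends
  private remembered: string[] = [];     // persists only on explicit user request

  addTurn(text: string): void {
    this.sessionTurns.push(text);
  }

  // The user has to actively say "remember this": remembering is the opt-in,
  // forgetting is the default.
  rememberLastTurn(): void {
    const last = this.sessionTurns[this.sessionTurns.length - 1];
    if (last !== undefined) this.remembered.push(last);
  }

  endSession(): void {
    this.sessionTurns = [];              // the default outcome: the data is gone
  }

  longTermMemory(): readonly string[] {
    return this.remembered;
  }
}
```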

These aren’t utopian fantasies. They’re product decisions waiting to be made. And if enough companies make them, they’ll reset the baseline for the entire industry.

Final Thought: Who Do You Work For?

At the end of the day, the question is simple: who do you work for—the user, or the quarterly earnings call? Every checkbox you design, every toggle you bury, every default you set is an answer to that question.

In the AI era, where the stakes are exponentially higher, ethical defaults aren’t a nice-to-have. They’re the difference between designing a future that respects human dignity—and one that strips it for parts.