Are you designing for the user’s values — or your own?
The future of design will be the negotiation between multiple moral worlds.
It’s not a stretch to suggest that as technology becomes more omnipresent, the designer’s role will shift from “hands-on” making to shaping how humanity interacts with machines. We will evolve from designing screens to defining moral guardrails. And like any profession with the power to influence lives — doctors, lawyers, policymakers — designers will need a strong foundation in ethics.
This is the core reason I developed the Five Pillars of Ethical Interface Design — an evolving framework I actively invite other designers to engage with and offer feedback on. The pillars are meant to push designers to confront the moral weight of their decisions rather than hide behind aesthetics, heuristics, or empathy theater.
The truth is designers rarely operate from a neutral position. Even when they insist they’re “designing for the user,” their own assumptions, values, and biases inevitably slip into the work — usually without them noticing.
This is mainly due to what can be described as ethical misalignment — the gap between the user values designers think they’re honoring and the internal values that actually shape their decisions.
One of the main drivers of this misalignment is the industry’s habit of treating empathy as a stand-in for values — and sometimes as permission to slip personal priorities into the work.
Designers start assuming that a superficial understanding of what someone experiences automatically tells them what should matter. It’s a comforting belief. It’s also wrong.
As Don Norman has argued, designers can’t actually have empathy in any literal sense — we can only observe behavior and imagine internal states. And imagination is unreliable.
The worst example of this is applying empathy before actual research occurs—projection masquerading as insight. We all do this, and it feels like basic human decency, but it can be dangerous to assume you know what another individual experiences or what their values actually are.
Empathy clarifies what a user experiences, but it does not tell you which experience should matter more. It doesn’t decide which trade-offs are acceptable, or how to weigh one user’s needs against another’s. Those choices come from values, not empathy.
Part of the problem is that traditional UX methods don’t actually measure user values. They measure usability, preferences, and observable behavior — but none of that tells you how a person morally prioritizes inclusion, autonomy, privacy, transparency, or well-being.
Because those deeper values remain invisible, designers end up filling the gaps with their own assumptions. That’s how teams convince themselves they’re “designing for the user” when the product is quietly reflecting the team’s values instead of the user’s.
How designer values sneak into the interface
It shouldn’t be surprising that everyone’s moral compass points in different directions. Flip between Fox and CNN for thirty seconds and you’ll see two completely different worldviews presented as fact.
Just as news outlets present competing realities, designers construct realities too — every interface carries an implicit stance about what should matter. For example, when inclusion rises to the top of a designer’s priorities, the user experience naturally becomes more nuanced. Accommodating a wider range of needs often softens the straightforward simplicity a majority audience might expect.
If well-being dominates, the pattern shifts again. Efficient or engaging features get softened or buried to reduce compulsion. That may feel responsible inside the team, but it frustrates users who value speed, momentum, or freedom.
And when elegance outranks transparency inside the team, disclosure gets sacrificed to a clean, frictionless flow. Teams hide or compress information instead of presenting it fully. That satisfies an internal bias for elegance, but it leaves users with less clarity and less agency.
These types of choices rarely come directly from user research. They come from the ethics of the team itself — values that quietly harden into the architecture of the product.
Why explicit ethical structures matter
Ethical value tensions show up in nearly every product decision, yet the industry still struggles to confront them explicitly. Batya Friedman’s Value Sensitive Design (VSD), developed decades ago, was one of the first efforts to systematize this concept. It provided a structured way to examine which human values ought to guide technology — from the conceptual level to empirical research to technical implementation.
However, VSD has not gained widespread adoption in industry, largely because it works better as an academic framework than as a practical tool designers can apply to everyday product decisions. It’s also worth noting that any system attempting to prescribe a moral framework runs into the problem David Hume identified: you cannot derive an ought from an is.
My framework tries to avoid that trap. Instead of prescribing specific moral priorities, it exposes the gap between the values teams believe they’re honoring and the values users actually experience in the interface. It’s not about telling designers which values to adopt; it’s about making the value tensions visible so trade-offs are conscious rather than accidental.
But before we can make ethical decisions, we need to understand the user’s values with precision. That’s why I’ve started to develop a simple Ethical Interface Design evaluation tool to sit alongside the Five Pillars of Ethical Interface Design framework.
The tool uses a short 10-question survey. For each of the five pillars (inclusion, autonomy, transparency, privacy, and well-being), users answer two prompts: one that captures their ethical preference — how important that pillar is to them — and another that evaluates how well the product currently delivers on that value, each rated from 1 to 5.
Together, the five pillars and the evaluation tool are not about prescribing which values you should hold. They expose the gap between what users actually prefer and the moral defaults the product is quietly imposing. The math is simple:
User preference (1–5) minus Product score (1–5) = Gap.
- 0 → aligned
- Positive → you’re not meeting the value the user actually wants
- Negative → you’re pushing a value harder than the user prefers
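The arithmetic above can be sketched in a few lines of Python. The pillar names come from the framework; the rating values and the helper functions are illustrative, not part of the actual evaluation tool.

```python
# Sketch of the gap calculation: user preference (1-5) minus product score (1-5).
# Pillar names are from the framework; all ratings below are made-up examples.

PILLARS = ["inclusion", "autonomy", "transparency", "privacy", "well-being"]

def gaps(preferences, product_scores):
    """Per-pillar gap for one user: preference minus product score."""
    return {p: preferences[p] - product_scores[p] for p in PILLARS}

def mean_gaps(responses):
    """Average gap per pillar across a group of users.

    `responses` is a list of (preferences, product_scores) pairs.
    """
    per_user = [gaps(prefs, scores) for prefs, scores in responses]
    return {p: sum(g[p] for g in per_user) / len(per_user) for p in PILLARS}

# Example: this user wants far more transparency than the product delivers
# (positive gap) and less well-being enforcement than it imposes (negative gap).
prefs = {"inclusion": 4, "autonomy": 3, "transparency": 5,
         "privacy": 4, "well-being": 2}
scores = {"inclusion": 4, "autonomy": 3, "transparency": 2,
          "privacy": 3, "well-being": 4}

print(gaps(prefs, scores))
# transparency: +3 (under-delivering), well-being: -2 (over-delivering)
```

Averaging the gaps with `mean_gaps` across many respondents is what surfaces the group-level patterns discussed next.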
The key to making this survey work is writing questions that expose ethical design tradeoffs, rather than asking users to rate abstract values directly. The transparency questions, for example, are still being refined.

Across a group of users, the patterns show you exactly where your design values match the audience — and where your own worldview is leaking into the interface.
The ethical pivot point
Technology is no longer just entertainment, utilities, or conveniences.
It mediates attention, communication, decisions, even identity. That makes every value baked into an interface more consequential. We’re not arranging screens anymore. We’re shaping the logic of human–machine interaction — and by extension, shaping behavior and culture.
By understanding user values through ethical design frameworks and tools, we can help designers see where their instincts, assumptions, and moral defaults are quietly overriding the people they’re designing for. Not to eliminate value tension, which is impossible, but to recognize it and make trade-offs consciously instead of unconsciously.
Ethical design isn’t about being virtuous. It’s about being honest with yourself about the values you’re imposing.
And in the end, it comes down to one unavoidable question: Are you designing for the user’s values — or your own?
References
- Bias in computer systems: https://dl.acm.org/doi/10.1145/230538.230561
- Why I Don’t Believe in Empathic Design: https://medium.com/thinking-design/why-i-dont-believe-in-empathic-design-c3dd0a956de9
- Value-Sensitive Design: https://dl.acm.org/doi/pdf/10.1145/242485.242493
- Do Artifacts Have Politics?: https://monoskop.org/images/8/8c/Winner_Langdon_1980_Do_Artifacts_Have_Politics.pdf
Originally published in UX Collective on Medium.
