This content originally appeared on DEV Community and was authored by Osman Gunes Cizmeci
I didn’t always talk explicitly about values in my design process. Early in my career, I treated ethics, inclusion, privacy — all of that — as constraints or “nice to haves” you layered in at the end. Over time, though, I’ve come to believe they must be foundational. In fact, I now see value-sensitive design (VSD) as a kind of compass that keeps me anchored in what really matters: creating technology that respects people.
What Is Value-Sensitive Design?
At its core, VSD is a design methodology that integrates human values systematically throughout the design process. It encourages you to ask: Which values are at stake? Who are the stakeholders? How might design decisions privilege or harm them?
Because values are rarely obvious or universal, VSD is inherently multi-layered. It asks us to iterate between conceptual investigations (what do users care about?), empirical investigations (how do users behave, feel, or push back?), and technical investigations (how do our systems support or thwart those values?) — all in a loop.
Why It’s Urgent in 2025
We’re at a moment where interfaces, AI systems, and immersive platforms are so pervasive that the stakes of design decisions feel existential. As technology progresses, the hidden value trade-offs are becoming more visible.
Opaque AI influence — Interfaces now personalize so deeply that the decisions behind what users see are often invisible to them. When the logic is opaque, how do users trust or contest those decisions?
Data & privacy flux — Our designs often require data to function, but more and more users are wary of what’s collected, how it’s used, and who owns it.
Diverse contexts of use — A “one size fits all” design is more dangerous than ever. What feels seamless in one culture or environment might feel invasive or alien in another.
In a sense, VSD feels like a necessary antidote to the “build fast, iterate later” culture. If we skip value thinking early, we end up retrofitting or, worse, inflicting harm.
How I Use Value-Sensitive Design in My Work
I’ve adapted VSD to fit my own process. Here are a few practices I’ve integrated (and refined, sometimes painfully):
- Value Mapping Before Wireframes
Before sketching anything, I explicitly map values in tension — transparency vs. simplicity, convenience vs. consent, efficiency vs. reflection. I sketch “value maps” that visualize how design decisions might push users one way or another.
This map becomes my north star during design reviews. Whenever a team member suggests a shortcut, I ask: “Which side of our value map does this lean toward?”
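The value-map idea can be sketched as a tiny data structure. Everything here is hypothetical — made-up names, not a real tool or my actual review format:

```python
from dataclasses import dataclass

@dataclass
class ValueTension:
    """One axis of a value map: two values in tension, and where a decision leans."""
    left: str     # e.g. "transparency"
    right: str    # e.g. "simplicity"
    lean: float   # -1.0 (fully toward left) .. +1.0 (fully toward right)

def describe(tension: ValueTension) -> str:
    """Name which side of the map a design decision leans toward."""
    if tension.lean < -0.25:
        side = tension.left
    elif tension.lean > 0.25:
        side = tension.right
    else:
        side = "balanced"
    return f"{tension.left} vs. {tension.right}: {side}"

# A shortcut that hides data use behind one "Accept" button leans
# hard toward simplicity at the cost of transparency.
shortcut = ValueTension("transparency", "simplicity", lean=0.8)
print(describe(shortcut))  # transparency vs. simplicity: simplicity
```

The point isn't the code; it's that forcing every shortcut to declare a lean makes its cost explicit during review.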
- Stakeholder Interviews + Value Probes
Beyond standard user interviews, I introduce probes (surveys, scenario exercises, conceptual cards) to surface hidden values. I ask: What makes you feel in control? What feels invasive? The answers often surprise me.
These probes help me see values users care about — sometimes more than features themselves.
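One lightweight way to work with probe answers — purely illustrative, with invented responses — is to code each answer with the value it surfaced and tally the results:

```python
from collections import Counter

# Invented probe answers, coded after the session with the value each one surfaced.
responses = [
    ("What makes you feel in control?", "seeing exactly what data is stored", "transparency"),
    ("What feels invasive?", "location tracked while the app is closed", "privacy"),
    ("What feels invasive?", "contacts uploaded without asking", "consent"),
    ("What makes you feel in control?", "an obvious way to undo", "agency"),
]

# Tallying the coded values shows which ones users raise unprompted --
# often values no feature request ever mentioned.
value_counts = Counter(value for _question, _answer, value in responses)
print(value_counts.most_common())
```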
- Value Testing
In usability tests, I don’t just ask “Can you complete this task?” I also ask: Did you feel respected? Did anything feel manipulative? Would you change permissions or opt out of any part of this flow?
I compare versions of flows not just on efficiency, but on how they score in terms of trust, clarity, and comfort.
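That comparison can be as simple as averaging value-oriented ratings alongside task metrics. A minimal sketch with invented session data:

```python
from statistics import mean

# Invented post-task ratings (1-5) and task times from two versions of a flow.
sessions = {
    "flow_a": {"task_time_s": [42, 51, 38], "trust": [4, 5, 4], "clarity": [4, 4, 5], "comfort": [5, 4, 4]},
    "flow_b": {"task_time_s": [31, 29, 35], "trust": [2, 3, 2], "clarity": [3, 3, 4], "comfort": [2, 3, 3]},
}

def summarize(flows):
    """Average every metric so a fast-but-uncomfortable flow is visible as such."""
    return {
        name: {metric: round(mean(scores), 1) for metric, scores in data.items()}
        for name, data in flows.items()
    }

summary = summarize(sessions)
# flow_b wins on speed but loses on trust, clarity, and comfort --
# exactly the trade-off a pure efficiency comparison would hide.
print(summary)
```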
- Technical Support for Values
Design decisions should be paired with technical mechanisms that enforce or protect values. If consent is a value, I might bake in revocable data access or visible toggles rather than hidden defaults. If inclusivity is a value, I ensure typography scales cleanly, alt text is robust, and motion can be reduced.
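As a sketch of what "revocable data access" might mean in code — a hypothetical class, not a real API:

```python
from datetime import datetime, timezone

class ConsentLedger:
    """Sketch of revocable consent: grants are explicit, revocation takes
    effect immediately, and every data access checks the current state."""

    def __init__(self):
        self._grants = {}  # purpose -> UTC timestamp of the grant

    def grant(self, purpose: str) -> None:
        self._grants[purpose] = datetime.now(timezone.utc)

    def revoke(self, purpose: str) -> None:
        self._grants.pop(purpose, None)

    def allowed(self, purpose: str) -> bool:
        return purpose in self._grants

ledger = ConsentLedger()
ledger.grant("analytics")
assert ledger.allowed("analytics")
ledger.revoke("analytics")
assert not ledger.allowed("analytics")  # access stops the moment consent is withdrawn
```

The design choice that matters is the last line: consent is state that is consulted at use time, not a checkbox recorded once and forgotten.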
- Iteration & Reflection
Values shift. Contexts evolve. What felt like a good balance six months ago might feel off today (e.g., in light of news about algorithmic bias or data breaches). I revisit value maps and audits regularly — not just when features get added.
What I’ve Learned (Good & Hard)
Over time, VSD has transformed how I see what “good design” means — but it hasn’t been easy. Here are a few lessons I’ve gathered:
- Trade-offs are unavoidable. You often can’t maximize all values simultaneously. The trick is to make trade-offs visible and defensible, not hidden.
- Not everyone cares equally. Some stakeholders — product leads, business teams — may prioritize growth or engagement over values like privacy. Those tensions have to be surfaced and negotiated.
- It can slow you down (if you let it). My early VSD efforts felt like friction. But I’ve learned to embed value thinking in smaller iterations, so it doesn’t block progress but guides it.
- You need allies. VSD is easier when you have engineers, product managers, and leadership who are aligned on value principles. If design is the only voice raising these questions, you’ll be pushing uphill alone.
Looking Forward
I believe that in the next few years, we’ll start to see value-aware AI, UX 3.0, and design ecosystems that aren’t just reactive to data, but proactive in upholding values. (As some researchers suggest, UX is evolving from “user-centered” to “human-AI-centered” frameworks.)
My hope is that design education, tooling, and team cultures shift so designers don’t have to be lone moral agents — value thinking becomes a shared foundation.
In the end, I design with VSD not because it looks “ethical” in marketing pamphlets, but because I want to build systems I can live with. When technology powers our lives so intimately, our values can’t live at the margins — they must live in the infrastructure.