Digital systems rarely feel oppressive — but many quietly enforce hierarchies we’re not supposed to notice. From sign-up flows to document uploads, we’re often nudged into paths we didn’t choose, governed by rules we didn’t write, for outcomes we didn’t fully understand. And because it all looks so clean, so “user friendly”, we rarely notice what’s missing — let alone who might be missing out.

Interfaces encode hierarchy

Not all users are created equal, at least not from a system's perspective. Defaults are set based on assumed behaviours. Permissions reflect pre-approved roles. Drop-downs are arranged with someone's convenience in mind.

These decisions might seem small, but together they create invisible ladders: access for some, friction for others. And when the system fails, error messages rarely blame the architecture — they blame the user.
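To see how small these decisions are, and how much they carry, here is a minimal sketch in TypeScript. Every name, value, and rule below is hypothetical; the point is that each "neutral" default is a guess about who the expected user is, and the error message pins the cost of a bad guess on whoever doesn't match it.

```typescript
// Hypothetical sign-up configuration. Every value below is a choice,
// even though it reads as a neutral default.
type Role = "admin" | "member" | "guest";

interface SignupDefaults {
  country: string;  // a guess about where "most" users live
  language: string; // a guess about what "most" users speak
  role: Role;       // a guess about how much access a newcomer deserves
}

const defaults: SignupDefaults = {
  country: "GB",   // convenient for some, friction for everyone else
  language: "en",
  role: "guest",   // the bottom rung; promotion is someone else's decision
};

// Validation that blames the user rather than naming the rule.
function validateUsername(name: string): string | null {
  if (!/^[a-z0-9_]{3,20}$/.test(name)) {
    // The constraint (lowercase ASCII, 3 to 20 characters) is the
    // system's choice, but the message frames it as the user's mistake.
    return "Invalid username.";
  }
  return null;
}
```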

Language also acts as a gatekeeper. Instructions that rely on domain knowledge or insider vocabulary exclude without ever needing to say “no”. Even good-faith attempts at simplicity can betray unspoken assumptions about who the “real” user is, and who the system wasn’t really built for.

Gatekeeping hides behind usability

Sometimes, what looks like friction is actually a filter. Think of the user asked to upload “supporting documentation” without being told what that means. A confident, tech-savvy user might guess the format, rename their file, and sail through. Someone less familiar with digital conventions might miss a step, then miss a deadline. 

The outcome isn’t a matter of effort or value. It’s a matter of decoding the system.

This isn’t always intentional. Designers often face a genuine dilemma: over-explain and you bore the confident user; under-explain and you alienate the cautious one. 

But the problem isn’t complexity — it’s where the cost of misunderstanding falls. Too often, those with the least confidence are asked to jump through the most hoops.
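To make that asymmetry concrete, here is a hedged sketch of the upload check described above. The accepted formats and messages are invented; what matters is that both functions enforce exactly the same rule, but only the second makes the rule decodable without insider knowledge.

```typescript
// Illustrative file formats; no real system is being quoted here.
const ACCEPTED = [".pdf", ".docx"];

// Version 1: the filter. Decoding the rule is the user's problem.
function checkUploadOpaque(filename: string): string | null {
  const ok = ACCEPTED.some((ext) => filename.toLowerCase().endsWith(ext));
  return ok ? null : "Upload failed. Please try again.";
}

// Version 2: the same rule, but the system explains itself, so the
// cost of misunderstanding stays with the architecture, not the user.
function checkUploadExplicit(filename: string): string | null {
  const ok = ACCEPTED.some((ext) => filename.toLowerCase().endsWith(ext));
  return ok
    ? null
    : `We can only accept ${ACCEPTED.join(" or ")} files. ` +
      "Rename or convert your file and try again.";
}
```

The confident user passes both checks. Only the first one quietly filters out everyone else.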

Neutrality is never neutral

There’s a seductive myth in system design: that if we just use clean lines, formal logic, and consistent rules, we can eliminate bias. But even the most elegant platform reflects values. Whose identity gets verified? Whose data gets profiled? Who decides what counts as “normal” use?

When platforms insist they are neutral, what they’re really saying is: “Trust us not to misuse the power we’ve quietly embedded.” And trust, as we know, is earned through transparency, not demanded through terms and conditions.

Accessibility brings this tension into focus. If your system only works well for those with fast broadband, fluent English, or neurotypical cognition, then it’s not neutral — it’s selective. And that selectivity, masked by technical polish, becomes a subtle form of exclusion.
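Even a single constant can do this selecting. A sketch, with an invented threshold: the timeout below looks like a pure engineering decision, yet it quietly defines whose connection counts as “normal”.

```typescript
// A "neutral" fetch wrapper that selects for fast connections.
async function loadProfile(url: string): Promise<Response> {
  const controller = new AbortController();
  // Three seconds is generous on fibre and hopeless on a rural
  // mobile link; the user on the slow end sees a failure, not a choice.
  const timer = setTimeout(() => controller.abort(), 3_000);
  try {
    return await fetch(url, { signal: controller.signal });
  } finally {
    clearTimeout(timer);
  }
}
```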

Conclusion: Power is designed

Power in digital systems doesn’t shout. It nudges, defaults, and times out. It appears logical while enforcing norms. It looks clean while encoding values. And it often goes unnoticed until someone hits a wall they didn’t see coming.

As a product person, I believe in design as a force for good. That means noticing when rules serve the system more than the user, and having the courage to ask why. The more invisible the power, the more important it is to trace its wiring.

Because systems don’t just reflect the world, they shape it. And shaping wisely begins with seeing clearly.
