“Trust me, bro—your data is totally safe”: UX, privacy, and the hiring paradox

Photo: afif - stock.adobe.com

Imagine the following scenario, if you will:

An enthusiastic candidate is job-hunting in good ol’ Geneva, Switzerland.

Let’s call her Avery.

Avery stumbles across a posting that makes her do a double take. It checks all the boxes:

  • A UX role that actually matches her skills
  • A tech company she’s had her eye on for years
  • And best of all? Just a short drive from home

Could it get any better?

Actually—turns out it could.

An email lands in her inbox inviting her to move forward with a short online assessment. The planets are aligning.

Or… are they?

Avery reads the rest of the email only to find out her webcam needs to stay on for the entire assessment.

Why?

Because the third-party testing platform prevents cheating by taking snapshots of her every 30 seconds.

Yikes.

Welcome to modern hiring—where proving you’re trustworthy begins by handing over your blind trust.


Privacy theater: the (definitely not Santa) clause

One clause in the Terms of Service gave Avery serious pause. It more or less says the platform—and any affiliated companies—can do pretty much whatever they want with her data: reuse it, publish it, all without ever checking in with her again.

Like… for real?

Ma’am. This is a Wendy’s.

This is a simple job application. No one’s signing over film rights.

So… what exactly does that mean? That they can take your assessment data—including images, recordings, and results—and reuse it however they see fit, without ever asking again?

For a UX test? In a hiring context?

To make things more confusing, their privacy policy says something else entirely—offering reassurances about responsible storage, limited sharing, and GDPR protections.

So which is it?

Because from where Avery’s sitting, the discrepancy between what’s said in the policy and what’s buried in the terms is starting to feel like a major trust-breaker.

Is this even revFADP-compliant?

Under the revised Swiss Federal Act on Data Protection (revFADP) and the EU’s GDPR, personal data must be processed lawfully, transparently, and for specific, proportionate purposes. Candidates must know:

  • Where their data is stored
  • Who it’s shared with
  • For what purpose
  • And how long it’s kept

So are we seeing any of that here?

Or is this just an all-access pass to distribute your data indefinitely unless you specifically ask them not to?

Even more troubling to Avery? The privacy policy vaguely notes that if her data is transferred to a country with weaker protections, she should rest assured that the platform has taken care of the necessary security stuff—whatever that may be.

So you know… rest assured.
You resting?
And by resting assured, you also feel reassured… don’t you?

Photo: nyul – stock.adobe.com

Is this even good UX?

Setting the compliance question aside for a moment, let’s look at this through a design lens. Because yes, this is very much a UX issue.

Why?

Trust is the heart of user experience.

That’s why.

Would you trust an app that asked you to turn on your webcam, let it take timed snapshots of you, and possibly surrender licensing rights… before you could even see what the app does?

Gotta say, I’m team Avery here.

We know that surveillance erodes user trust (HBR, 2024).
We know that dark patterns, unclear consent, and vague data rights make for questionable experience design.

So should we be okay with this in the context of hiring?

If your product demands high trust but delivers low transparency, that’s just not a great experience.

If anything, it feels like something of a double standard.


What this says about the workplace

Hiring is the first impression your company makes on prospective employees.

It tells candidates what you value and how you operate.

And if the very first interaction demands surveillance, compliance (well, theirs at least), and unchecked consent… what happens once they’ve signed on the proverbial dotted line?

In other words:

Is there any particular reason Avery shouldn’t suspect this company’s culture is built on micromanagement?

And what about the universal principle of reciprocity? If you preemptively distrust people, is there any reason they shouldn’t return the favor?

If fairness is one of your company’s core values, why not start with clarity, proportionality, and mutual respect… instead of fear and control?


The real test

Can this really be viewed as a test for Avery as a candidate?

Or is it only fair to also see it as a test for the companies using this tool?

Because if it is, the results weren’t great. And Avery… well, she’s a tad underwhelmed.

It makes her wonder whether, somewhere between “streamlining” and “standardizing,” trust fell by the wayside. Whether data collection has been quietly conflated with fairness.

As the hiring landscape grows more automated and opaque, one thing is clear:

The tests are changing.
The tools are changing.
And if companies want the trust of tomorrow’s talent…
they’d better remember that the times, they are a-changin’.


Question for you:

How much personal data are you willing to give away… just to get your foot in the door?

Contact me to share your perspective—I’d love to hear from you!

This post was co-written with ChatGPT to refine ideas, structure arguments, and enhance clarity.