Most companies tell you what they know about you. Most privacy policies are little more than lists of the data they collect.
We're going to do something different. We're going to tell you what we don't know about you.
Because in a world where hacks, breaches, and data leaks are common, what a company doesn't collect matters more than what it promises not to share.
We don't know your name.
During verification, we confirm you're a real human. We don't need your name to do this. We don't store it. We don't want it.
We don't know your email address or phone number.
These are the most commonly used identifiers for tracking you across the internet. Sharing them seems benign: "This company may need to contact me. I should give them a way to do so." But your contact info is also the easiest, most convenient way to track you.
We don't know where you've been.
We don't track which platforms you use, which websites you visit, or which communities you join. Your not.bot is yours to use as you wish, without being tracked.
We don't know who you talk to.
We don't ask for your contacts, and we don't track who you interact with. Your social graph is invisible to us. We verify you as human. What you do with that verification is your business.
We don't know your address.
Your physical location is irrelevant to proving you're human. We don't collect it, infer it, or track it.
We don't know your face.
Many verification systems rely on facial recognition or other biometrics. We don't. Your face is yours.
We don't know your browsing history.
We don't use tracking cookies. We don't buy data from brokers. We don't piece together your interests from your online behavior.
We don't know which alias is which.
If you use multiple aliases, we can't tell which one you're using where. That's by design. The proper term is unlinkability, and we take it seriously.
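To make that concrete, here's a toy Python sketch of one common way unlinkable aliases can be built: each pseudonym is derived from a secret that stays on your device, scoped to the place where it's used. This is an illustration of the general idea, not not.bot's actual protocol; the names and the derivation are hypothetical.

```python
import hashlib
import hmac
import secrets

# Hypothetical illustration only: a secret that never leaves the user's device.
device_secret = secrets.token_bytes(32)

def alias_for(context: str) -> str:
    """Derive a stable pseudonym scoped to one service or community.

    Without the device secret, two of these values look like unrelated
    random strings, so they can't be linked to each other or to the user.
    """
    return hmac.new(device_secret, context.encode(), hashlib.sha256).hexdigest()

print(alias_for("forum.example"))   # alias used on one site
print(alias_for("market.example"))  # a different, unlinkable alias on another
```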
The technology that makes this possible is called multiparty computation (MPC).
MPC is a type of privacy-preserving cryptography that lets two computers, such as your phone and our servers, work together toward a common goal, like verifying that you're human, without either one revealing the information it keeps secret.
Example: I can prove I'm over 21 without showing you my driver's license (which would also reveal my exact birth date, my address, my full name, and my photo).
not.bot uses this approach throughout.
The math doesn't require trust. It's verifiable. It's not "trust us, we won't look." It's "we mathematically cannot look."
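If you want to see one of the building blocks in code, here is a toy Python sketch of additive secret sharing, a standard ingredient of MPC. A value is split into two random shares; either share on its own is just noise, and real protocols compute on shares like these without ever reconstructing the secret in one place. This is a teaching example under those assumptions, not not.bot's implementation.

```python
import secrets

MOD = 2**61 - 1  # a large prime modulus for arithmetic over shares

def split(secret_value: int) -> tuple[int, int]:
    """Split a secret into two shares. Each share on its own is uniformly
    random and reveals nothing about the secret."""
    share_a = secrets.randbelow(MOD)
    share_b = (secret_value - share_a) % MOD
    return share_a, share_b

def reconstruct(share_a: int, share_b: int) -> int:
    """Only someone holding both shares can recover the secret."""
    return (share_a + share_b) % MOD

# Hypothetical example: the phone keeps one share of a birth year, the
# server holds the other and, on its own, sees only a random number.
phone_share, server_share = split(1990)
assert reconstruct(phone_share, server_share) == 1990
```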
Most tech companies optimize for data collection. More data = more valuable ads = more revenue.
This creates a structural incentive to know everything about you. Even "privacy-focused" companies often collect data "for improvement" or "security" that they could choose not to collect.
We built not.bot without advertising. Without data brokering. Our revenue model and market differentiation actually depend on not knowing who you are.
For transparency, here's what we do need:
We know a unique human exists.
This is the core function. We need to confirm that a not.bot maps to a real, unique person. For now, we have you scan the NFC chip in your passport. More ways to verify that you're a unique person are planned.
We know when verification occurred.
Timestamps for credential issuance are necessary to enforce validity periods.
That's it. The minimal information necessary for verification to work, and nothing more.
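To make "minimal" concrete, here is a Python sketch of roughly what such a record could contain. The field names and the validity window are hypothetical, not not.bot's actual schema; the point is what's absent: no name, no contact details, no location, no biometrics.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass(frozen=True)
class VerificationRecord:
    """Hypothetical sketch of a data-minimized credential record."""
    uniqueness_tag: str   # opaque marker meaning "a unique human exists"
    issued_at: datetime   # when verification occurred

    def is_valid(self, validity: timedelta = timedelta(days=365)) -> bool:
        """The issuance timestamp exists only to enforce a validity period."""
        return datetime.now(timezone.utc) - self.issued_at < validity

record = VerificationRecord(
    uniqueness_tag="opaque-value",
    issued_at=datetime.now(timezone.utc),
)
assert record.is_valid()
```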
The conversation around privacy usually focuses on what companies promise not to do with your data.
We think the conversation should shift to what companies choose not to collect in the first place.
Minimization > Promises.
Architecture > Policy.
Cryptographic impossibility > Contractual commitment.
This is what privacy looks like when it's built in from the start—not bolted on as an afterthought.
What we don't know about you is the point. Not because we're careless. Because we designed it that way.
Verification without surveillance. Proof without exposure. Trust without data.
That's not.bot.