There are many threats to privacy. Some are systemic, some are cultural, and some are even well-intentioned, like when a bank asks you to prove you are you before opening an account. Seems fair, right? It’s not. The problem is trust. Trust-based systems are an existential threat to privacy.
Let’s think about this a bit. If I want to order something, subscribe to something, or generally engage in most economic and non-economic transactions in modern society, some premise of trust generally underlies the exchange. The greater the consequence of the transaction, the more trust is required. If I want to make a reservation at a restaurant, I can call and give a fake name and number, since the consequence of me not showing up, besides a lifetime of bad karma, is minimal. If I want to get a home loan from a bank, however, I need to divulge a large tranche of current and historical financial data in order to establish trust in my financial wherewithal. That kind of trust is called credit, but it’s still trust.
There are many currencies of trust. In some cases, it’s time. I trust an old friend because we’ve known each other a long time and I have a sense, albeit anecdotal, of their trustworthiness. In some cases, it’s achievement. I may trust my doctor because he went to an Ivy League medical school. In some cases, it’s demonstrated. Due diligence in the business world is a form of demonstrated trust: do the research to show that something or someone is trustworthy. There are many others but, in all cases, trust sits opposite privacy on a spectrum, since the cultivation of trust requires increasing amounts of personal information.
Culturally we have come to believe that privacy is a moral failing. “Oh, that Alec is so private, he must have something to hide…” I prefer to think of privacy as less about what you are hiding and more about what you are protecting. In tech there’s a concept called “least privilege,” sometimes phrased as “minimum access,” which governs how access to networks and applications is allocated to users. An organization reduces risk by giving each user just the minimum level of access needed to perform their role. Kind of like “need to know” in gov-speak. I apply this concept everywhere in my life. Admittedly it’s a bit of a sport for me at this point, but rarely do we have to hand over all the information that’s requested. Think of all the times you go to some office and they ask you to fill out the intake form, and the friendly associate at the front desk says something like, “actually, all we really need is your name and for you to initial page 3 and sign page 4.” Make that your approach to everything. What it always comes down to, though, is that there is some minimum amount of information required to establish some threshold of trust to execute a transaction. That’s the problem. Some trust is always required.
I can keep fighting my little privacy battle along with my privacy friends and we can all high five over little victories against asymmetric adversaries like “Government” and “Corporations”, but the true solution is to move toward trustless systems. Let’s look at two examples.
The first is One-Time Pad (OTP) cryptography. OTP is a symmetric cipher which, if deployed with truly random (not pseudorandom) keys that are as long as the message and never reused, offers Shannon-style information-theoretic security. As in, it’s unbreakable. Unbreakable now, and unlike mathematically derived ciphers, permanently unbreakable. My friend and colleague KC always says that figuring out what day of the week it will be on April 15th, 2185 is a math problem. It may be a hard problem, but with enough processing power the answer is discoverable. Asking what the weather will be like on April 15th, 2185? That’s not a math problem. We can guess, but there’s no way to know. OTP is like the latter. Given that there’s no way to break OTP, it represents a trustless system for its users. If Alice and Bob encrypt messages to each other using OTP, they can post those messages on Facebook, send them via Gmail, or put them on a billboard in Times Square. No one can read the message except the intended recipient. No trusted third party, no rent-seeking intermediary, no broker of record. In fact, Alice doesn’t need Bob to divulge anything about himself other than the fact that he can read her message. In the case of OTP, cryptography is a privacy-preserving function.
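The mechanics are almost embarrassingly simple, which is part of the appeal. Here is a minimal sketch in Python: encryption is just XOR of the message with the key, and because XOR is its own inverse, decryption is the identical operation. (Note the assumption: `os.urandom` stands in for a true random source here; a real OTP deployment would draw its keys from a hardware random number generator, and each key must be as long as the message and used exactly once.)

```python
import os

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each message byte with the corresponding key byte.

    The key must be truly random, at least as long as the message,
    and never reused -- those conditions are what make OTP unbreakable.
    """
    if len(key) < len(plaintext):
        raise ValueError("OTP key must be at least as long as the message")
    return bytes(p ^ k for p, k in zip(plaintext, key))

# XOR is its own inverse, so decryption is the same function.
otp_decrypt = otp_encrypt

message = b"meet me at dawn"
key = os.urandom(len(message))  # stand-in for a true hardware RNG
ciphertext = otp_encrypt(message, key)
recovered = otp_decrypt(ciphertext, key)
assert recovered == message
```

Without the key, the ciphertext is consistent with *every* possible plaintext of the same length, which is exactly why no amount of future computing power helps an attacker.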
A second example of privacy-preserving cryptography is Bitcoin. Bitcoin is a trustless system for recording and transferring monetary value on an open, permissionless, distributed ledger. One of the many novel advancements offered by Bitcoin is that it obviates the massive bloat of the financial services sector through trustless movement of value. If Alice wants to send Bob money, Alice doesn’t need to sign up on Venmo, hand her driver’s license to some unknown database, link a bank account to prove she has funds, create a username, find Bob’s username, send the money, and then wait for Venmo to approve. Then, maybe three days later (or sooner, for a fee), Bob can access his funds in his bank account, provided he goes through the same process as Alice. Each step along the Venmo path further degrades the privacy of the parties to the transaction, not to mention those who make their transaction “public” on Venmo. Why, I can’t imagine, but when you Venmo someone at 3am for “Scooby Snacks” and set it to public, we all know what you are up to. And Venmo is the easy button. Try to move more than $2,300 at a time and the privacy degradation gets even worse. Not so with Bitcoin. On the Bitcoin blockchain, Alice can move any amount she wants in about 10 minutes (the average block confirmation time) to any other user, without KYC, a middleman, or any need to trust her counterparty. The old “check’s in the mail” routine doesn’t work with Bitcoin. Every transaction is provable to any party that wishes to check. If you trust no one, then Bitcoin is the bank for you.
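What makes that verifiability trustless is proof-of-work: anyone can recompute a block’s double-SHA-256 hash and confirm it meets the network’s difficulty target, with no authority vouching for it. Below is a toy sketch in Python under simplifying assumptions; the header format here is made up (a real Bitcoin header is a fixed 80-byte structure), and the difficulty is set absurdly low so the example runs in milliseconds.

```python
import hashlib

def block_hash(header: bytes) -> bytes:
    # Bitcoin hashes block headers with SHA-256 applied twice.
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

def meets_target(header: bytes, difficulty_bits: int) -> bool:
    # Verification is trustless: anyone can recompute the hash and
    # compare it to the target. No third party needs to be consulted.
    return int.from_bytes(block_hash(header), "big") < 2 ** (256 - difficulty_bits)

# Toy "mining" loop: search for a nonce that satisfies the target.
# A difficulty of 16 bits takes ~65,000 tries on average; the real
# network's difficulty is astronomically higher.
header_base = b"prev_hash|merkle_root|timestamp|"  # hypothetical layout
nonce = 0
while not meets_target(header_base + nonce.to_bytes(8, "big"), 16):
    nonce += 1
mined_header = header_base + nonce.to_bytes(8, "big")
assert meets_target(mined_header, 16)
```

The asymmetry is the point: finding the nonce costs real work, but checking it costs one hash. That one-line check is what lets Alice prove to Bob, or to anyone, that her payment is settled, with no Venmo in the middle.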
I think we are a ways from a systemically trustless society, so in the interim I’ll keep fighting the good fight, deploying privacy measures and making life hard for intake-form collectors. But at the same time, if we look for trustless options and resist the urge to overshare where we can, maybe we can start to shift the imbalance between trust and privacy. At the very least, you shouldn’t trust me.