The Sunk Cost of Mistrust
Many of our institutions rely on humans' innate tendency to lie, cheat, deceive, and distrust even their own kin. While at first glance this might appear bad, inhuman, or simply not right, that's the way we are and have been for ages.
The classic way of solving these trust issues has been to build fortresses, real or virtual, as a way of restricting the movement of the party suspected to be malicious. This has been the backbone of all things security: doors, locks, safes, and later on, cryptography, two-factor authentication, and biometric scans. It would hence not be far off to say that a lot of people have made a living simply because humans don't trust each other. Pretty much the entire security economy thrives on our inability to trust each other. Yet this approach carries a huge sunk cost. The cost of setting up these systems, the cost of data lost in breaches through unreported backdoors, and the research going into building harder-to-crack systems easily run into billions of dollars of pure sunk cost: money that can never be recovered.
Is there an end to this cat-and-mouse game? Theoretically speaking, no. Even something as strong as 256-bit encryption can in theory be cracked, although cracking it is well outside the realm of modern computing. But that's not the point.
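To put "outside the realm of modern computing" in perspective, here's a back-of-the-envelope sketch. The attack rate below is an invented, absurdly generous assumption (a quintillion guesses per second), not a benchmark of any real machine:

```python
# Back-of-the-envelope: how long would brute-forcing a 256-bit key space take?
keys = 2**256                       # total keys in a 256-bit key space
guesses_per_second = 10**18         # hypothetical attacker speed (assumed, generous)
seconds_per_year = 60 * 60 * 24 * 365

years_to_exhaust = keys // (guesses_per_second * seconds_per_year)
print(f"roughly {years_to_exhaust:.1e} years to try every key")
```

Even under these fantasy assumptions, the answer comes out around 10^51 years, which is why "theoretically crackable" and "practically crackable" are very different claims.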
What I'm trying to say is that if you somehow trick me into telling you my password and steal my phone, no amount of high-tech security can prevent you from stealing sensitive information from me. No matter how hard colleges have tried, students have found ways around anti-cheat systems in exams. Even though banks constantly evolve their security measures, there has been an unprecedented rise in cybercrime, especially targeting the elderly, which compelled the RBI to run campaigns on staying safe from such frauds. Denuvo's technology made pirating games very difficult, but it came under fire for degrading gaming performance on most machines. Thus, it would be safe to say that these systems only make cheating difficult, not impossible.
It's like saying a very large number can be considered infinity. That's the typical "engineer" way of looking at things. Any mathematician would tell you infinity is much more than a number: it's an abstraction, a concept suggesting there lies something beyond the largest number we can ever conceive. In a medical analogy, these are band-aid solutions that don't address the root of the problem: why don't people trust each other?
Also, for quite some time, we've been approaching this trust problem from only one angle, while clearly there are two.
Instead of only walling people out and making it difficult for them to cheat, how about incentivizing people not to cheat and rewarding them for showing trust? We know that incentives are a pretty powerful force for getting things done, and we've only scratched the surface when it comes to understanding the origins of human behaviour.
Religion does a great job at this. Some villages in South India claim to have an amazingly low crime rate because of well-placed religious incentives. And if Yuval Noah Harari is correct about the modern legislature being yet another sophisticated religion, we might as well improve it to promote trust among each other. That could help a lot with the second part of this trust equation, which I find is often overlooked.
Speaking of cryptocurrencies: security-wise, they're yet another solution in the direction of preventing people from doing bad. But bad actors might well find a way to game the system (just as happened with fiat currency) unless we use the power of the right incentives.
People should be made aware of win-win games. What cheaters play is a zero-sum game in the short run (automating spam to jam inboxes and form fields, DDoS attacks), but it can turn into a lose-lose game in the long run (annoying captchas, ever-stricter security measures, inconvenience for everyone).
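The short-run/long-run distinction above can be made concrete with a toy payoff matrix. The numbers here are illustrative assumptions, not measurements; they just encode the idea that cheating pays the cheater once, while mutual cheating drags everyone down via the "security tax":

```python
# Toy payoff matrix for two parties who can either cooperate (act honestly)
# or cheat. Entries are (A's payoff, B's payoff); values are invented for
# illustration only.
payoffs = {
    ("cooperate", "cooperate"): (3, 3),   # win-win: both benefit
    ("cooperate", "cheat"):     (0, 5),   # cheater gains at the other's expense
    ("cheat",     "cooperate"): (5, 0),   # ...and symmetrically
    ("cheat",     "cheat"):     (1, 1),   # lose-lose: everyone pays the security tax
}

def total_welfare(a_move, b_move):
    """Joint payoff: what society as a whole gets out of the interaction."""
    a, b = payoffs[(a_move, b_move)]
    return a + b

print(total_welfare("cooperate", "cooperate"))  # 6, the highest joint outcome
print(total_welfare("cheat", "cheat"))          # 2, the lowest joint outcome
```

The cheater's individual payoff (5) beats cooperation (3), which is exactly why fortress-building alone fails; the incentive-design question is how to reshape these numbers so honesty also wins individually.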
Netflix has done a good job at solving the piracy issue. Do you really want to prevent people from watching pirated movies? How about making movie watching convenient at a small fee?
Spotify did something even better. Want to prevent people from downloading free MP3s from shady websites? Dominate the record-label market, buy rights to podcasts, and make people believe in the importance of your mission to empower musicians. Provide convenience to the customer at a small fee.
(I'll cover more such platforms in detail in a future essay.)
This might well be a problem that can't be fully solved. From one perspective, everything we do, lawful or unlawful, is permitted by nature. Criminal activity is possible precisely because the rules a land carves in stone are not laws of physics. In a way, social justice is a shared imagination, and these shared imaginations become the reality of the collective. The only proper "laws" we have are the laws of nature: the laws of motion penned by Newton and Kepler, Faraday's law of electromagnetic induction, and all the other physical laws we know, which hold whether or not anyone challenges them. Those natural phenomena will always be there; only the way we perceive them can change.
Another not-so-widely-discussed law of nature is that human behaviour can be shaped by the right environment and the right incentives. Product managers and UX researchers at companies like Facebook and Instagram, and even traditional salespeople, know this well; that's how they're able to convince people to buy things they didn't even need in the first place!
And that's a train of thought on how psychology could help us with cybersecurity. The solution won't be easy; all I've tried to do in this article is state the problem and look at another way of approaching it. It goes without saying that the leaps we've made in building trustless systems such as DeFi will prove very important to the social fabric of our society in the years to come. It's just that if we also push hard from the other side, we can double our speed of progress in preventing crime in general.
The problem thus becomes: how do you incentivize people to be trustworthy?