Selling Trust

Behind the scenes, the internet is a giant trust machine. To separate the good websites from the bad, the useful software from the harmful, an entire industry has been established that specialises in selling small text files to individuals and organizations so that they can sign their digital products. Operating systems are complicit in this, as they force people to buy and use such certificates.

Posted in Journal on March 13, 2020


On the internet, nobody knows you’re a dog, as the saying goes. Except, of course, Google and Facebook, who watch our every step out in the wild. Still, Google presents us with images of crosswalks and traffic lights, as if there were any remaining doubt about your non-dogness. If data is the new gold, data quality is the carat of the digital era. Data startups around the world do their best to collect high-quality datasets and sell them to companies thirsty to boost consumption in the population.

When these companies pay to use your data — sold by third-party vendors you have never even heard of — they don’t pay so much for the results of their efforts, but rather because they trust these vendors to provide high-quality data to advertise with. A dataset is not worth a penny if there is no certainty as to whether the records point to real people or to dogs.

Trust remains at the core of our every interaction on the internet, even if asserting it has become a minor inconvenience we have learned to accept in our digital lives. Whenever we visit a page, we apply a certain heuristic to determine how much trust we are willing to extend to it. Nevertheless, as scammers get better at making us think that PayPal did in fact ask us for our password, we cannot rely solely on our own wits to judge whether something is trustworthy.

One area where this is extremely visible is software. We have all had our experiences with trojans, viruses and malware in the past. So how do you ensure the software you just downloaded is trustworthy? The solution the big two operating system vendors came up with, as far back as 2012, was to force developers to code-sign their applications with a custom-issued certificate, ensuring that the app really is from you, and not from your hacking dog who likes to scam people. While a good idea in general, there were, of course, caveats. First: how do you trust a certificate? And second: where do you get such a certificate in the first place?
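Before code signing became mandatory, a common (and weaker) way to check a download was to compare its checksum against one published by the developer. A checksum proves integrity — the file was not tampered with in transit — but not identity, which is exactly the gap a signing certificate is meant to fill. A minimal sketch, with a stand-in byte string instead of a real installer:

```python
import hashlib

# Stand-in for the installer you just downloaded (hypothetical content).
data = b"pretend this is the installer you downloaded"

# The SHA-256 digest the developer would publish alongside the download.
published_sha256 = hashlib.sha256(data).hexdigest()

# Recompute the digest locally and compare. A match only tells you the
# file is intact -- not WHO produced it. That is what signatures add.
local_digest = hashlib.sha256(data).hexdigest()
print("match:", local_digest == published_sha256)
```

Note that this scheme still requires you to trust the website publishing the checksum — the bootstrapping problem that certificate chains try to solve.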

Certificates are trusted because they form a kind of “original blockchain”, so to speak. There is a limited number of so-called root certificates from which all other certificates are derived — as long as the chain is unbroken, even a certificate twenty links down the chain can be verified. These root certificates have been issued to a handful of companies, and their signatures are bundled with everything we use today: browsers and operating systems ship with a list of these root certificates, which means both can check the signature of an app you download and follow its certificate chain back to one of those roots. If the chain checks out, the browser or operating system knows it can trust the software, because the certificate companies enjoy the trust of Microsoft and Apple, Mozilla and Google.
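That bundled list of root certificates is not an abstraction — you can inspect the one your own system ships with. A short sketch using Python’s standard `ssl` module (how many roots appear, and whether any appear at all, depends on your platform and how its trust store is configured):

```python
import ssl

# Build a TLS context; with no explicit CA file given, this loads the
# platform's default trust store of root certificates.
ctx = ssl.create_default_context()
roots = ctx.get_ca_certs()

print(f"{len(roots)} trusted root certificates loaded")
for cert in roots[:3]:
    # 'subject' is a tuple of relative distinguished names,
    # each itself a tuple of (field, value) pairs.
    subject = dict(rdn[0] for rdn in cert["subject"])
    print(subject.get("organizationName"), "-", subject.get("commonName"))
```

Running this typically prints a long list of certificate authorities — the handful of companies the article is talking about, hard-coded into your machine.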

In fact, there is a standard for how such a certificate company should verify the identity of someone ordering a certificate, including checking official documents and making phone calls to ensure the person really is who they claim to be. However, there is no legislation demanding this. It is purely a measure of trust. And this is something these companies sell at high prices. After all, when you order such a certificate, you essentially pay for them trusting you. The cost of actually creating a certificate is incredibly low — it only requires entering some data into a form and clicking “submit”. Period. Yet these companies charge prices as high as $500 for just one year of trust. You read that right: such a certificate expires after one year. You have to continuously pay these companies just so that they keep trusting you. It’s like feeding a baby only to keep it quiet, which, I assume, is not good parenting.
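That one-year countdown is encoded right in the certificate, in its `notAfter` field, and checking it is trivial. A minimal sketch with Python’s standard library, assuming a hypothetical expiry date rather than a real certificate:

```python
import ssl
import time

# The 'notAfter' field as it appears in a parsed certificate
# (hypothetical value for illustration).
not_after = "Mar 13 00:00:00 2021 GMT"

# Convert the certificate timestamp to seconds since the epoch.
expires = ssl.cert_time_to_seconds(not_after)
remaining_days = (expires - time.time()) / 86400

if remaining_days < 0:
    print("certificate expired -- time to pay again")
else:
    print(f"{remaining_days:.0f} days of trust left")
```

Once that date passes, the signature on your software stops being accepted — regardless of whether the software itself changed at all.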

“Don’t buy such a certificate then”, one might be tempted to say now. But there is simply no choice. To a certain extent, if you’re developing software, you have to play by the rules imposed upon you. And both Windows and macOS are pretty good at gatekeeping (Apple’s mechanism is literally called “Gatekeeper”). If your software is not signed, users will be prompted with scary warnings such as “This file is not commonly downloaded” (Firefox) or “This software may harm your computer” (Windows SmartScreen filter). It is still possible to install the software. However, software aimed at non-technical users in particular will simply not be installed, precisely because of these warnings. Why should you install something by someone you haven’t even met in person, when your operating system tells you not to?

So, to avoid these warnings, you have to pay some company in the United States to issue you a certificate that you can then use to keep users’ computers quiet about your software. But it doesn’t stop there: you not only have to pay them for verifying your identity, you also have to prove that identity. And this can prove difficult, especially if you come from a country where it is not common to include your mobile number on bills. When I ordered a code signing certificate, I realized that none of my bills included my phone number. So currently I am waiting for the company to verify whether or not my phone actually belongs to me. And all of this for a 10 kB text file.

The whole certificate industry resembles a giant pyramid scheme in which a few people at the top sell trust down to the bottom and rule over the identities of people on the net. And all of this simply because a few companies (the CA/Browser Forum) decided to require it. Trust certainly does not trickle down easily. But money rushes up.
