The problem is trust
I first wrote this in a letter to my brother, but I decided to share it more broadly. I'll try to describe the problem without devolving into flowery language, but forgive me if I stray from pragmatism a little.
With this past election I've been pondering the problem of trust. The internet (particularly social media) rushed us into an age where everyone simultaneously has a megaphone and headphones. Stories are shared left and right without any clear confirmation.
A case in point is the Dakota Access Pipeline (DAPL). I constantly see chatter back and forth about whether the tribe attended the zoning meetings, whether they were just greedy, whether protesters were funded by train companies, etc. With the plethora of news sites starting up, shutting down, and starting up again, it's hard to tell who's legit.
In search of a solution, I've been thinking through a few known problems and approaches.
In cybersecurity we use a chain of trust. Websites have certificates issued by a "root" certificate authority (e.g., Verisign). My PGP public key is on my Facebook profile, so I can sign my emails with a key that folks can verify (using Facebook as my root of trust). CNN uses a signed certificate (and the recognizability of the name "CNN"), so I can have a reasonable degree of confidence that when I'm reading a story on CNN.com, it's actually from CNN and not from someone running a fake cnn.com.
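As a toy illustration of the chain-of-trust idea (not real cryptography -- actual PKI validates cryptographic signatures, and all the names below are made up), verification amounts to walking from a certificate back to a trusted root:

```python
# Toy chain-of-trust walk: each certificate names its issuer, and
# verification succeeds only if the chain ends at a trusted root.
# Illustrative only -- real PKI checks actual cryptographic signatures.

TRUSTED_ROOTS = {"Verisign"}

# Hypothetical issuer map: certificate holder -> who signed their certificate.
ISSUED_BY = {
    "cnn.com": "IntermediateCA",
    "IntermediateCA": "Verisign",
    "fake-cnn.com": "UnknownCA",
}

def chain_to_root(holder, issued_by, roots, max_depth=10):
    """Return the issuer chain if it reaches a trusted root, else None."""
    chain = [holder]
    current = holder
    for _ in range(max_depth):
        issuer = issued_by.get(current)
        if issuer is None:
            return None          # dangling chain: nobody vouches for this party
        chain.append(issuer)
        if issuer in roots:
            return chain         # anchored in a trusted root
        current = issuer
    return None                  # chain too long (or cyclic)

print(chain_to_root("cnn.com", ISSUED_BY, TRUSTED_ROOTS))
print(chain_to_root("fake-cnn.com", ISSUED_BY, TRUSTED_ROOTS))
```

The fake site fails not because its name looks wrong, but because no trusted root ultimately vouches for it -- which is the whole point of the chain.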
As I’ve tried to market my book, I’ve learned that marketing is not as easy as I imagined. It’s really hard to get an idea into people’s heads. Even folks I’ve been friends with on Facebook for a long time have little to no idea that I wrote a book. Being able to expose an idea to a lot of people is immensely valuable, and I’m just starting to appreciate that. Getting people to actually do something (e.g., install a browser plugin to verify the sources of documents they read) is harder still, close to impossible.
Throughout the election there were tons of organizations (NPR, factcheck.org, etc.) vetting claims and accusations. Shortly after the election and the news of Facebook spreading false stories, some students came up with a fancy algorithm for verifying articles. Unfortunately, none of this seems to have made any difference. Message boards are still filled with back-and-forth unverified garbage.
What to do?
I think this situation demands a dramatic shift in how we approach news media and information in general, but the shift needs to be cultural as much as technological. The answer isn’t just a technical fix; it has to change how non-techy people look at information.
I'm far from a solution, but here are my approaches to this problem:
1. Make digital signing the norm, and make it obvious. People are starting to get used to seeing the little lock icon telling them their browser connection is secure. I want every news article to be signed by its author, and again by its editor. If possible I’d like each fact in the article to be signed and referenced with its source.
2. Push conflicting views on people. Sometimes I’m a little worried that the news I read is so consistently and adamantly anti-Trump. Even NPR is pretty soundly critical. It makes me scared that I’ve unwittingly boxed myself into a liberal echo chamber.
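The signing idea in point 1 -- an article signed by its author and countersigned by its editor -- could be sketched roughly as below. To keep the sketch self-contained I use HMAC from Python's standard library as a stand-in; a real system would use public-key signatures (e.g., PGP) so readers could verify without any shared secret, and all keys and text here are invented:

```python
import hmac
import hashlib

# Toy "signed article": the author signs the body, and the editor
# countersigns the author's signature. HMAC (a shared-secret MAC) stands
# in for real public-key signatures purely to keep this stdlib-only.

def sign(key: bytes, message: bytes) -> str:
    return hmac.new(key, message, hashlib.sha256).hexdigest()

article = b"Protesters gathered at the construction site on Tuesday."
author_key, editor_key = b"author-secret", b"editor-secret"

author_sig = sign(author_key, article)
editor_sig = sign(editor_key, author_sig.encode())

def verify(body: bytes, a_sig: str, e_sig: str) -> bool:
    """Both signatures must check out for the article to verify."""
    ok_author = hmac.compare_digest(sign(author_key, body), a_sig)
    ok_editor = hmac.compare_digest(sign(editor_key, a_sig.encode()), e_sig)
    return ok_author and ok_editor

print(verify(article, author_sig, editor_sig))            # True
print(verify(b"Tampered text.", author_sig, editor_sig))  # False
```

Because the editor signs the author's signature rather than the raw text, tampering with either the body or the author's attestation breaks the whole chain.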
How to do it?
Finally we arrive at the golden ticket to solve this mess. Or rather, where the golden ticket would be if I had one. I’ve done my best to articulate the need, but that’s far from enough. I don’t have a clear idea of the how, but here are some off-the-wall ideas:
Create a new HTML equivalent supporting embedded references and/or citations
Perhaps a "VHTML" (Verifiable Hypertext Markup Language) extension of HTML that allows embedding references and proof of authorship inside text. Browsers could then display unverified or partially verified text in a different style from verified sources.
Produce a new (Google News style) aggregator that aims to provide conflicting views on everything.
Create a web crawler to construct a “chain of trust” graph for news verification.
Create a “badge” or “widget” that can be installed on a web page (kind of like a CAPTCHA badge) indicating that the human author of the content signed off on it, with links to its sources.
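The chain-of-trust graph idea above could, very roughly, work like this: seed the graph with a site you trust, then let trust decay with each hop of "who vouches for whom." Everything below -- the site names, the edges, and the 0.8 decay factor -- is invented for illustration:

```python
from collections import deque

# Hypothetical trust graph: an edge A -> B means "A cites/vouches for B".
# Names and the decay factor are made up for this sketch.
TRUST_EDGES = {
    "npr.org": ["factcheck.org", "apnews.example"],
    "factcheck.org": ["apnews.example"],
    "apnews.example": [],
    "rumor-mill.example": ["rumor-mill.example"],
}

def trust_scores(root, edges, decay=0.8):
    """Breadth-first walk from a trusted root; trust decays per hop.

    A site keeps the best score reachable from the root; anything
    unreachable (like the self-citing rumor mill) gets no score at all.
    """
    scores = {root: 1.0}
    queue = deque([root])
    while queue:
        site = queue.popleft()
        for neighbor in edges.get(site, []):
            candidate = scores[site] * decay
            if candidate > scores.get(neighbor, 0.0):
                scores[neighbor] = candidate
                queue.append(neighbor)
    return scores

scores = trust_scores("npr.org", TRUST_EDGES)
print(scores)
```

A real crawler would have to solve much harder problems -- who gets to be a root, how citations are discovered, and how to resist sites that game the graph -- but the shape of the computation is this simple.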
If you read this far, thanks! I’m curious what you think.
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1
-----END PGP SIGNATURE-----
How to verify this post
Unfortunately this is not nearly as easy as I'd like it to be.
- Install gpg tools or an equivalent gpg package.
- Copy my public key from Facebook to a plain text file called tyler.asc
- Import my public key
gpg --import tyler.asc
- Copy the PGP portion of the post to a plain text file.
- Verify the signature with GPG:
MacBook-Air:Documents tyler$ gpg --verify testcopied.txt
gpg: Signature made Sun Dec 4 11:48:37 2016 CST using RSA key ID A30D1C3B
gpg: Good signature from "Tyler Smith (Personal GPG Key) <firstname.lastname@example.org>"