This article is mainly a well-written and well-researched portrait of Shyam Sankar, the CTO of Palantir. It also includes a couple of interesting and funny passages on the company itself:

Palantir has by now been around so long, and been the subject of so many chin-stroking think pieces on the ethical implications of its software, that it’s a wonder what a difficult time expert observers, ordinary people, and even the company’s executives still have explaining what exactly it does and doesn’t do, which civil liberties (if any) it could plausibly be accused of breaching, and why anyone should associate technology as mind-numbingly dull as “data integration and visualization” with either violations of the U.S. Constitution or of saving Western civilization.

The problem that led to the intelligence failure of 9/11, and which still hadn’t been fixed or even really addressed five years later, was how America’s various intelligence agencies (there were 13 in 2001, and 16 by 2006) stored, shared, and analyzed the intelligence gathered by their systems and spies. The analysts’ primary software tool was called i2 Analyst Notebook, which allowed the user to create PowerPoint-like link charts nearly as primitive as the corkboard and red yarn kind used in Hollywood depictions of pre-internet municipal police stations: a picture of a guy’s face with a line pointing to another guy’s face and the words “reports to” or “travels with,” etc.

Worse, the data that facilitated the link charts was stored in individual Excel files and other desktop apps that in turn were stored on over a dozen different systems and had to be manually uploaded from one system to another, which often meant analysts literally emailing each other file attachments. Worse still, the analysts generally served in their positions for only two years at a time; when their replacement arrived, they would have to make sense of Excel tables that essentially represented the idiosyncratic thought processes of a different person.

An example of this problem, and how long it continued to persist even after the catastrophic intelligence failures of 9/11, was the 2009 bombing attempt of Northwest Airlines Flight 253 by Umar Farouk Abdulmutallab, aka the underwear bomber. Roughly five weeks before the attack, Abdulmutallab’s father personally visited the U.S. Embassy in Abuja, Nigeria to warn the officers there that his son had become a religious extremist, was potentially planning something against the United States, and was likely in Yemen. The father’s tip was logged into the U.S. National Counterterrorism Center’s central database but was never cross-referenced with any other available intelligence, such as NSA intercepts confirming Abdulmutallab’s presence in Yemen, the fact that the UK had recently rejected his visa request, or that the CIA was aware of deranged anti-American posts he’d been making on social media.

Thus, do not trust your institutions blindly. Hold them to high standards, verify, and then, cautiously, trust.

If you feel like responding, tell me what you think about “Trust in absence of verification is naïveté (institutional version)” via Signal. It is designed as an intentional way to interact, without ads, tracking, or constraints on content. You can even send a photo or a voice memo.