Businessweek has a feature out on the data-mining company Palantir. The piece opens with the dramatic story of a “foreign national named Mike Fikri” whose activities, taken together, sound very suspicious — large withdrawals from a Russian bank, repeated phone calls to Syria, solo visits to Disneyland, and the rental of a large truck. But software made by Palantir connects the dots about Fikri, alerts the authorities, and the simmering terrorist plot is foiled.
Only thing is, this story is completely fictional. And that speaks volumes.
We humans are story-driven beings, and a compelling narrative like this can be more powerful than any amount of facts, figures and logic.
But, as Businessweek reveals after breathlessly relating it in great detail, the “Mike Fikri” story is just a hypothetical the company uses to sell its product — fictional right down to the made-up name. Yet Palantir has been thriving, thanks mostly to the government, which is trying to make Palantir’s story — or something like it — come true.
It’s a compelling story, but there are lots of reasons to believe it will always be filed under fiction. A major National Research Council report and other experts examining the question have all concluded that pattern-based data-mining — in which suspicious patterns of activity are flagged, cold, by computer algorithms — is very unlikely to be effective against terrorism.
Speaking of fiction, Palantir is named after a device in The Lord of the Rings that allows characters to see anything, anywhere. Ironically, in the novels, those who peer into a palantír are often deceived by what they see. Yet Businessweek quotes the company’s CEO as saying the firm believes its mission is to “protect the Shire.”
When we at the ACLU hear “we need lots of amazing new powers in order to protect you,” our response is, “watch out.”
Now, there’s no problem with the government getting better at analyzing its own legitimately collected and stored terrorism-related intelligence. That’s what we want the government to do, and what it should have done to thwart attackers like the 9/11 hijackers and the “underwear bomber,” Umar Farouk Abdulmutallab.
But the key word is “legitimately.” The problem comes when the government starts throwing in masses of information about the activities of innocent Americans. Unfortunately, we know that since 9/11, our security agencies have been irresistibly drawn towards mass surveillance as a principal strategy in the so-called “war on terror.” Instead of working outward from known leads and actual evidence based on individualized suspicion, they have turned towards the misguided strategy of sifting through millions of innocent people’s communications and activities — boiling the ocean in the hopes of finding what is, essentially, a freak occurrence: somebody plotting an attack. (Unsurprisingly, the Government Accountability Office has found that DHS has not properly evaluated the effectiveness of its data-mining systems.)
Businessweek also passes along government anecdotes about how this technology “helped” with various law enforcement successes. The government routinely offers such stories to support privacy-invading technologies, but as always we have no way to verify them — or to know what “helped” really means, and whether it boils down to “the success would have happened even without the privacy-invading tool, but the tool played some minor part.”
Of course, in the end it’s not Palantir’s decision what data sets security agencies plug into its software, and we don’t know how deeply the company is entangled with those agencies in how the software is operated. Depending on the details of how it’s used, its deployment could be anything from a good, efficient use of government resources to a true totalitarian nightmare: monitoring the activities of innocent Americans on a mass scale, collecting the records of those activities, and leaving them open to suspicionless exploration by government analysts. Unfortunately, everything we know suggests it is likely to be closer to the latter.