Australia, We Need To Have An Urgent Chat About Surveillance Technology And The Race For Marginal Votes
By Mel McCartney at Political Omniscience
Businesses that make money by collecting and selling detailed records of private lives were once plainly described as "surveillance companies." Their rebranding as "social media" is the most successful deception since the Department of War became the Department of Defense.
— Edward Snowden (@Snowden) March 17, 2018
We live in a time when an election can be won or lost in a handful of seats. Instead of campaigns focusing on ideas and policies, the focus is now on influencing voters in marginal seats. Data mining and surveillance tools that were created for war zones, and later picked up by the advertising industry, are now being used against civilians. Privacy laws and regulations haven’t kept pace with technology, and too many people are willing to exploit the loopholes. Data-mining business models like those of Cambridge Analytica and i360 can’t exist without tech giants such as Google, Facebook, and Palantir.
Everyone is scraping, selling and analysing our data; the game is now about influencing our behaviour and predicting what we will do next. Palantir’s co-founder, Peter Thiel, wasn’t joking when he named his company after the ‘far-seeing’ stones from The Lord of the Rings. Technocrats take advantage of the fact that most of us aren’t technologically minded, especially the politicians they pitch their technology to. Seeing into the future with predictive technology, including predicting crime, is now the main game. In this article, I will provide background on the tech giants and how they adopt each other’s technologies, explain how data is being used against marginal voters with the debut of i360 software in Australia, and look at what looms on the horizon. First, we will take a look at Palantir and its ties and similarities to Facebook.
What is Palantir and how is it connected to Facebook?
Peter Thiel co-founded PayPal and sold it to eBay in 2002. Taking advantage of a post-9/11 world, he founded Palantir in 2003, a company that adapted PayPal’s fraud-recognition software to stop terrorist attacks. Thiel provided the seed money for Facebook in 2004 and joined Facebook’s board, where he still sits today. Mark Zuckerberg was only 19 years old at the time, and Thiel soon became a mentor and friend. Zuckerberg bragged at the time about how willingly people handed their data over to him, calling them “dumb f***s”.
Initially, Thiel was Palantir’s only investor, but in 2005 the CIA’s venture arm, In-Q-Tel, also invested. The CIA was Palantir’s only customer for the next three years, and the company had its best engineers testing and evaluating data-mining software designed to predict terror attacks. Word soon got around, and before long Palantir was winning contracts from intelligence agencies, military units, and eventually police departments. Police departments in America were interested in the technology’s ability to analyse big data to predict crime.
Predictive crime technology goes mainstream, including in Australia
Palantir secretly began using the population of New Orleans to test its predictive crime technology in 2012, offering its services to the city for free. In return, Palantir was given access to public records, court filings, licences, addresses, phone numbers, social media, and other non-criminal data to train its software to forecast crime. Palantir’s prediction model also used an intelligence technique called social network analysis (SNA), which looks for connections between people, places, cars, weapons, addresses, social media posts, and any other data stored in databases.
This information is analysed to predict whether people are likely to become the victim of a crime, or to commit one, based on their connections to known victims or criminals. It’s frighteningly close to the TV series ‘Person of Interest’, and there are so many things that can go wrong here:
- A culture in law enforcement of guilt by association.
- The profiling of citizens being analysed by an algorithm with no human oversight.
- Privacy issues.
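To make the guilt-by-association problem concrete, here is a minimal, invented sketch of the kind of social network analysis described above: people and records become nodes and edges in a graph, and anyone within a few hops of a known offender gets flagged. This is not Palantir’s actual code or scoring model; the graph, names, and two-hop threshold are all illustrative assumptions.

```python
# Illustrative social network analysis (SNA) sketch.
# Nodes are entities; an edge means two entities appear together in some
# record (an address, a vehicle, a phone call). "Risk" is scored purely by
# graph distance from a known offender -- guilt by association, literally.
from collections import deque

def shortest_hops(graph, start):
    """Breadth-first search: hop count from `start` to every reachable node."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in graph.get(node, ()):
            if neighbour not in dist:
                dist[neighbour] = dist[node] + 1
                queue.append(neighbour)
    return dist

# Toy data: citizen_b has never met the offender, only an associate of one.
graph = {
    "known_offender": ["associate_a"],
    "associate_a": ["known_offender", "citizen_b"],
    "citizen_b": ["associate_a", "citizen_c"],
    "citizen_c": ["citizen_b"],
}

hops = shortest_hops(graph, "known_offender")
# Flag everyone within two hops of the offender (threshold is arbitrary).
flagged = sorted(name for name, d in hops.items() if 0 < d <= 2)
print(flagged)  # ['associate_a', 'citizen_b']
```

Note that `citizen_b` is flagged despite having no record of their own, purely through a second-degree connection, which is exactly the failure mode the list above warns about.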