Spy agencies must be transparent about new data crunching, analyst warns
Privacy and basic rights are at risk of being compromised in the process of data collection
OTTAWA – Rapid technological advances in data collection and analysis are transforming the way spy agencies work, potentially putting civil liberties at risk, an Israeli intelligence expert has warned the Canadian security community.
The organizations responsible for keeping people safe must ensure privacy and basic rights are not compromised in the process, or risk losing the public's trust, Shay Hershkovitz said in a presentation to the Canadian Association for Security and Intelligence Studies.
Spycraft is being revolutionized by the growing number of smart devices, almost-unlimited data storage and the advent of artificial intelligence, Hershkovitz told the association’s recent annual conference in Ottawa.
“Transparency will be key here, and legislators will have to limit the use of these technologies,” he said.
“If intelligence agencies will not ask these questions and will not lead the public debate, they will be dragged into it kicking and screaming, and everyone will suffer and lose.”
Hershkovitz, a senior research fellow and former intelligence officer in Israel, had been slated to attend the conference at the Canadian War Museum, but a sudden illness meant the gathering of security officials and academics instead saw a pre-recorded, multimedia presentation of his ideas about the future of one of the world’s oldest professions.
“If we really want to learn what intelligence will look like, we must look outside the national-security establishment – that is, we should explore not only what governments are doing but, more important, what is happening in the private sector and in academia,” he said.
By next year, some 50 billion devices will be connected to the internet, growing to 100 billion devices by 2025, said Hershkovitz, head of research at the XPRIZE Foundation, a non-profit organization in California that manages public competitions intended to encourage beneficial technologies.
“The inevitable conclusion is that in the near future, in about five years from now, information will be spewing from every street, every car, every house and even from the sky.”
The price of data storage, meanwhile, is falling steadily. Storing one gigabyte of data cost about half a million dollars in 1980 but just two cents today, he said.
At the same time, the flood of data will only speed up the development of artificial intelligence, Hershkovitz predicted.
Intelligence agencies have traditionally made decisions to collect information about specific people and groups, diverting resources that could have been used to monitor other targets, he said. Now they can collect and sort information on a massive scale and decide later which information already in hand is most relevant.
Agencies will have to decide what information to store, and for how long, and analysts will need to work side-by-side with computers to sift the huge amounts of data, Hershkovitz added.
Revelations in recent years by former U.S. spy contractor Edward Snowden about widespread surveillance of communications created public awareness about the privacy risks of digital technologies and society’s increasing reliance on them.
Newly enacted security legislation recognizes the burgeoning role of big data, requiring the Canadian Security Intelligence Service to seek a judge’s permission to keep datasets that primarily contain personal information about Canadians.
During a conference panel discussion, engineer and lawyer Samuel Witherspoon emphasized the continuing need for humans to help make sense of such information.
Key decisions, possibly involving life or death, can’t simply be left to algorithms, said Witherspoon, co-founder of IMRSV Data Labs Inc., which is teaching computers to read, hear and see. “I think that’s an incredibly problematic approach.”
The intelligence community will have to grapple with the necessary restraints as storing vast amounts of data becomes even less expensive in coming years, said Benoit Hamelin, who has worked as a developer, researcher and manager at start-up companies involved in cyberdefence and threat detection.
“Of course there are ethical implications,” he said. “We have to set out an ethical framework.”