Deep learning and AI ‘cold war’ spells privacy concerns
Fears over AI overlords may be groundless, but the use of machine learning to mine personal information is a worrying development, artificial intelligence experts have warned.
Questions over machine-monitored surveillance have existed for decades, but the lack of technology sophisticated enough to put machine learning theory into practice quelled most concerns.
Recent advances in AI technology, however, have seen governments and corporations investing in deep learning, resulting in an unregulated “cold war,” said Dr Matthew Aylett, who works at the University of Edinburgh’s School of Informatics and is Chief Science Officer of Edinburgh-based text-to-speech company CereProc.
Top scientist Stephen Hawking recently claimed computers would overtake human intelligence within 100 years.
Dismissing this, Dr Aylett said: “To put it into perspective, take a prawn that swims around in the sea. Building a system as clever as a prawn is really hard. And we are, probably, not there yet – and we don’t agonise too much about how smart a prawn is.”
However, his concerns over deep learning, and specifically speech recognition technology, stem from a machine’s ability to predict outcomes from massive datasets.
“Given enough data sources you can find out things that people didn’t realise. Take the classic idea that if you know people’s positions on their phone, you can tell where they move about and guess where they work and where they live. It is very exciting for companies, who can sell things to you based on that.
“Think how much data Google has at its fingertips. You can see why it is very keen to use deep neural networks as an approach,” Dr Aylett said. “Google is not free because they are great guys but because it is their business model.”
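Dr Aylett’s location example is easy to make concrete. The following is a minimal, hypothetical Python sketch, not anything Google or CereProc actually runs: it snaps a handful of made-up, timestamped phone pings to a coarse grid and assumes the cell visited most overnight is home, while the cell visited most during office hours is work. The PINGS data and the grid_cell and guess_home_and_work helpers are purely illustrative.

```python
from collections import Counter
from datetime import datetime

# Hypothetical, made-up pings: (ISO timestamp, latitude, longitude).
# In reality these would come from a phone's location history.
PINGS = [
    ("2015-06-01T02:10:00", 55.9531, -3.1889),  # overnight, near "home"
    ("2015-06-01T10:15:00", 55.9446, -3.1871),  # office hours, near "work"
    ("2015-06-01T14:30:00", 55.9447, -3.1869),
    ("2015-06-01T23:55:00", 55.9532, -3.1888),
    ("2015-06-02T03:20:00", 55.9529, -3.1891),
    ("2015-06-02T11:00:00", 55.9446, -3.1872),
]

def grid_cell(lat, lon, precision=3):
    """Snap a coordinate to a coarse grid cell (roughly 100 m across)."""
    return (round(lat, precision), round(lon, precision))

def guess_home_and_work(pings):
    """Guess 'home' as the cell seen most between 22:00 and 06:00,
    and 'work' as the cell seen most between 09:00 and 17:00."""
    night, day = Counter(), Counter()
    for timestamp, lat, lon in pings:
        hour = datetime.fromisoformat(timestamp).hour
        cell = grid_cell(lat, lon)
        if hour >= 22 or hour < 6:
            night[cell] += 1
        elif 9 <= hour < 17:
            day[cell] += 1
    home = night.most_common(1)[0][0] if night else None
    work = day.most_common(1)[0][0] if day else None
    return home, work

home, work = guess_home_and_work(PINGS)
print("Likely home cell:", home)  # expected: (55.953, -3.189)
print("Likely work cell:", work)  # expected: (55.945, -3.187)
```

Scaled up to millions of users and joined with other data sources, this kind of simple inference is what makes location histories so valuable to advertisers, and so sensitive.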
Again, these worries have existed since the first theories of deep learning emerged, he said.
“We talked about the privacy issues of speech recognition in the 1980s. The big difference then was that if governments tapped your phone, someone had to listen to it, and that’s a lot of costly manpower. But if you can tap everyone’s phones and listen to them all at the same time, then that becomes something you can potentially do.
“That’s why there is the problem with the NSA. They can intercept everyone’s communications at the same time and that, of course, raises lots of issues.”
Ethics boards
Google spent a reported £400 million on UK-based artificial intelligence firm DeepMind in early 2014.
But the startup, fronted by Demis Hassabis – who at 13 was ranked the world’s second-best chess player for his age – wrote a condition into the acquisition contract. Its founders told Google that it should appoint an ethics board to monitor how AI was used in its search capabilities.
Google claims to have created the ethics board, but has yet to reveal who sits on it. It also has an ‘ethics and compliance team’; whether the board forms part of that team is unclear.
Will Ramey, a product manager at NVIDIA, which makes the GPUs (graphics processing units) that power deep learning algorithms, believes that appropriate governance has not caught up with the technology.
He told Techworld: “I think privacy is one area to be concerned about… Technology often leads, and the way we use it – appropriate use and governance – catches up. Right now, we are in the grand, exciting, experimental phase of what we can do.”
While Ramey said it was important that companies and researchers “push the boundaries”, he warned that the guidelines surrounding use of deep learning “are basically waiting to see what happens”.
He added: “Individuals have the right and practical ability to choose how much they want to participate in some of these experiments – that’s an area where we may be not quite doing everything that could be done.”
When asked whether NVIDIA GPUs had been used in the machine learning behind the NSA’s leaked mass surveillance programmes, Ramey said he could not comment.
However, he said that while advances in technology raise privacy concerns, the public sector has always invested in innovative security techniques.
“Government prints the money. If it wanted to record every telephone conversation – and could find a way to justify it legally – it could have been doing that with tape cassettes and employing large armies of people to make it searchable.
“I don’t think technological advances change what governments do, or want to do, but they may make it easier and more cost-effective for things like that to be done.”