A researcher from Newcastle University is to tackle the subject of ‘deep fakes’ – the latest manifestation of ‘fake news’ – in an upcoming public lecture.
While fake news typically refers to false or misleading statements we read or hear, deep fakes take the deception a step further.
Deep fakes use artificial intelligence (AI) to convincingly simulate people’s voices, likenesses and movements, usually for malicious purposes. It can be hard to distinguish a deep fake video from genuine footage of a person.
Deep fake videos of Donald Trump and Facebook chief Mark Zuckerberg have already appeared. Deep fakes could also be used to generate fake but convincing revenge pornography.
The emergence of deep fakes raises questions about whether such technology breaks the law and what technological tools and legal methods might be used to combat deep fakes.
Such issues will be explored by Lilian Edwards – a professor of law, innovation and society at Newcastle Law School – in an Alan Turing Institute lecture at the Barbican in London. Her lecture will be entitled ‘Regulating Unreality’.
Professor Edwards said, “As with all technological advances, AI can be used for good, but it can also be abused by criminals and those who want to deceive and do harm.
“Without proper scrutiny and regulation, we could slide into a situation where, as with fake news, it is difficult for ordinary people to distinguish what is synthetic from what is real.”
Concerns over fake news may expand from worries about the accuracy of the news we consume to doubts about a wide range of things we see, hear and experience.
Deep fakes could have an impact on numerous areas of the law, including intellectual property rights, defamation, anti-consumer scams and the evidence used in court.
The ‘Regulating Unreality’ lecture will be delivered at 6.30 pm on Thursday 11th July at the Barbican, London.
To book tickets – which are priced at £4 – please go to https://www.barbican.org.uk/whats-on/2019/event/regulating-unreality.
(Featured image courtesy of Christoph Scholz, from Flickr Creative Commons.)