Christina Trevanion opened up about being targeted by a deepfake (Image: BBC)
BBC expert Christina Trevanion was visibly shaken during her appearance on Morning Live as she recounted the harrowing experience of being targeted by a deepfake. The Flog It! star, 43, detailed the disturbing nature of the AI-driven abuse. Deepfakes use artificial intelligence to alter images or likenesses of individuals, often to create pornographic content, and Christina, who courageously shared her ordeal, was targeted in exactly this way.
She said: “I’m used to living life in the public eye. Often the reaction from the public has been kind and sweet and supportive, but over the last couple of years there’s been a noticeable shift and at times it can be quite intrusive.”

Christina continued: “Last September I discovered my image had been used to create phoney explicit videos known as deepfake porn. I was sent a very long list of sensitive URLs where my head had been AI-ed onto pornographic videos and images.

“As it sunk in, it was deeply distressing. I felt naive and stupid and utterly violated in every single way.”
…
Christina Trevanion is best known for her appearances on Bargain Hunt and Flog It! (Image: BBC)
Another individual, known by the pseudonym Jodie, also fell prey to this technology in 2021, when she received a link to deepfake images and videos that portrayed her in sexual scenarios with multiple men.
Jodie spoke out about the profound effect this invasion had on her, saying: “I just felt like my whole world shattered around me. I felt that if someone saw these images, they looked very real and they might think they are real.
“I felt my relationship might be on the line. Friends and family too, it really did feel that this could ruin my life. When I did find out who was behind them I was completely in shock, and I was feeling suicidal. This is a matter of male violence against women, and we need to make sure victims have the option of having their images removed. Ultimately that’s what many victims want.”
Under the pseudonym Christine, another victim highlighted the inadequacies of UK laws, which currently make it illegal to share or threaten to share intimate images but fail to cover the creation of deepfakes. Baroness Charlotte Owen is spearheading a campaign to close these loopholes with proposed changes to existing laws.
Christina said she felt “utterly violated” in every way (Image: BBC)
She expressed frustration at the reluctance of authorities to address this emerging type of abuse, stating: “The Law Commission report from 2022, they were less sure if the creation of these images was serious enough to criminalise because they thought if someone doesn’t know about it, there’s no harm caused. This is something I found to be totally wrong.
“The bottom line should be if a woman does not consent, that should be enough. [Her bill] covers the non-consensual taking, creation and solicitation to create sexually explicit images and video.

“I’m hoping it sets a marker down that it is an act of abuse. If they keep the legislation as the Lords passed it, [punishment] would be a fine, with prison as an option.”
Over the past few months, Christine has managed to get the majority of her deepfake content taken down, yet the ordeal still haunts her. She said: “It’s something I will always have hanging over me and other victims. Seeing your own image being used without your consent feels like you’re being robbed of your free will.”