I was 13 the first time I saw a stranger’s penis. It was 2005 and I was on MSN Messenger after school. A user added me as a contact and requested to share their webcam. Being young and trusting, I accepted and subsequently saw a man masturbating live on my screen.


I remember feeling confused, shocked and completely ashamed. My heart was racing with fear and guilt. I couldn’t tell my parents because they would stop me from using MSN, which to a millennial teenager meant saying goodbye to a social life. I also thought that I had done something wrong in accepting the request and so blamed myself.

If that had happened to me in the ‘real world’, I could have reported the perpetrator to the police and, upon conviction, he would have faced up to 10 years in prison for engaging in sexual activity in the presence of a child. Had I been over 16, he would have faced two years in prison for indecent exposure. Because it happened online, no such offences existed, and the next two decades would reveal to me that this is the norm for women and girls online. It was certainly not the last time that I would be subjected to an unsolicited image of a penis on my phone or computer with no legal protection available.

Finally, in 2023, we got the Online Safety Act. This introduced the offence of ‘cyberflashing’: sending an explicit image for the purpose of sexual gratification or to cause the recipient humiliation, alarm or distress. This is welcome, but two decades too late for me and my peers.

The law is not keeping up with the development of technology. Just as it catches up with the issue of unsolicited ‘dick pics’, new crimes are emerging in the world of virtual reality, known as the metaverse, and there is no legislation to deal with them. We are already too late. Again.

Police are currently investigating the virtual gang-rape of a young girl, whose avatar was attacked by a group of adult men while she was playing a Meta video game. Police have explained that the psychological trauma of the virtual rape was similar to that of a physical attack.

Women have been harassed online and in gaming for as long as that world has existed, but these games are now deeply immersive; it is no longer abusive text on a screen. An attack such as this might happen on a computer, but because of VR headsets it happens right in front of your face, and you are surrounded by sound. You can hear people’s voices in your ear, and the voices grow louder the closer their avatar is to yours. In some games, your avatar may be unable to escape if it is blocked in by other avatars. It could feel incredibly real.

It is not clear how police can investigate or bring charges for this crime under current legislation. There may be jurisdictional difficulties, given the global nature of the online world, and evidential difficulties, because games are not routinely recorded unless someone takes a screen recording. Then there is the question of how we classify this crime. Criminal law requires physical contact before someone can be charged with rape, and while it has been suggested that this could be classed as the creation of synthetic child abuse images, I think that is a stretch. Yet again, as in 2005, there is no offence covering the attacks that children are experiencing online.

The Online Safety Act requires tech companies to promote online safety by tackling illegal material and content that is harmful to children, conducting regular risk assessments and properly enforcing age limits. However, I am not confident that tech companies will make the necessary changes. For example, Meta’s response to the reported rape – ‘we have automatic protection called personal boundary, which [keeps] people you don’t know a few feet away from you’ – is weak. We know that most crimes of violence against women are committed by someone they know. Further, the harms of social media to children have been known and studied extensively for years, yet tech companies have done nothing concrete to prevent them.

We await Ofcom guidance on what tech companies will need to do. The act allows Ofcom to impose a range of sanctions on tech companies that do not comply, but the guidance is likely to be generic at first and largely reactive to issues as and when they arise.

There are also potential difficulties around sanctioning individual users on gaming platforms who do not comply with current rules and standards. Developers and publishers may be reluctant to ban users who have a large following for fear of backlash from the gaming community. A strong community is crucial to the success of a game, so this could be a disincentive to crack down on these issues.

Whatever the next steps are, it is clear that they should have already happened. What is needed here, and quickly, is new legislation. Technology is only getting more sophisticated and we need to pre-empt the challenges we are likely to face in keeping people, particularly women and children, safe online.

My answer would be to get children and teenagers into the room. They are on the ground; they know what is going on in the online and gaming world. Their insight would be invaluable.


Emma Ferguson-Law is a senior solicitor in the abuse claims team at Bolt Burdon Kemp, London