Safety Tech 2021: Unleashing the potential for a safer internet transcript

Speakers: Suki Fuller, Analytical Storyteller – (SF); Rt Hon Oliver Dowden, MP – (OD); Clara Tsao, Entrepreneur and DWeb Technologist – (CT); Rishad Tobaccowala, Publicis Group – (RT); Julie Cordua, Thorn – (JC); Marc Antoine Durand, Yubo – (MD); Weszt Hart, Riot Games – (WH); Rachael Franklin, EA – (RF)


SF       Welcome everyone to the first Safety Tech event, hosted by CognitionX and sponsored and curated by the Department for Digital, Culture, Media and Sport with the Department for International Trade. Safety Tech 2021 is the first event of its kind, designed to help those developing innovative products and services to create positive value for their brands and a safer online experience for users, while also raising awareness among those users, and for institutional organisations to learn more from potential partners. More importantly, and most importantly, is how we can take actionable steps. My name is Suki Fuller and I'm your host and moderator for the next three and a half hours, and boy do we have a lot in those three and a half hours. Right now we have over 500 delegates and 30-plus preeminent speakers from all over the world here with us today, and I am very excited. As you can tell I'm stumbling over my words, that's how excited I am. I'm sure that you're all aware of the personal emergency exits and toilet areas in your house, so we don't have to do that health and safety brief, but as we all know with online events, we request that you are courteous, polite and refrain from any harmful or derogatory language in the chat. And also, please do Tweet your insights, your questions, your thoughts, your joy at what you're seeing here; please follow @DCMS and CogX, @SafetyTechUK, @TradeGovUK, and the hashtag for today’s event is #SafetyTech2021.

Well we have some wonderful organisations to better help you navigate the sector in our events booth area Fair Play Alliance UK, Online Safety Tech Industry Association, Tech Coalition, WePROTECT Global Alliance, Global Internet Forum to Counter Terrorism, Trust & Safety Association, OASIS Consortium, Age Verification Providers Association, Internet Watch Foundation and the Online Harms Data Transformation Project.

As always, all content will be available online after the event so you can check out everything we're talking about later. That being said, our opening plenary, with two terrific keynotes and two fireside chats, runs from 1530 to 1630 and will begin shortly, followed by simultaneously occurring breakout rooms, where you have your choice of five, from 1630 to 1715. That ends with our closing plenary, which I strongly urge you to stick around for; that begins at 1715 and runs until 1800. But of course we know, as with any tech event, there may be issues, so bear with us if we do have some, and maybe possibly refresh your screen, because it may be at your end.

And now we have a brief ministerial message from the Secretary of State for Digital, Culture, Media and Sport, Oliver Dowden MP. 

OD         Well hi everyone, and I just wanted to welcome you all to this world-first safety tech expo being organised by DCMS, and thank you all for coming today. As I'm sure you know, we're determined to make the United Kingdom the safest place in the world to go online, and that means holding social media companies to account for the content on their platforms, which of course we'll be doing through our Online Safety Bill. But regulation can only go so far, and that's where the power of tech really comes in. Of course it's not a silver bullet, but countless speakers at the conference today have shown how we can deploy cutting-edge technology to make us safer online, and countless other safety tech companies are using digital innovation to tackle the darker parts of the internet, from grooming and revenge porn through to disinformation and harassment. Each of those innovations is helping us stamp out online harm bit by bit, and that's good for users, but of course it's good for the tech companies themselves. So it's great to see many of you here today from places as diverse as LEGO and the UN; I hope you have a very productive conversation and I very much look forward to hearing your ideas.

SF          Thank you very much, Secretary of State Oliver Dowden. And now, to discuss the future of security, trust and safety online, we have our first keynote speaker, Clara Tsao, and she is amazing. She's a civic technologist who has helped develop and enforce the principles and policies that define acceptable behaviour and content online with her organisation, the Trust & Safety Professional Association, and she's worked with so many tech organisations, including Google, Microsoft and Mozilla. Clara has also served as the first Chief Technology Officer at the US Department of Homeland Security - that's impressive - where she focused on countering home-grown extremism and foreign influence operations. But what's great is that we've got her because she's also on the advisory board for Number 10's innovation fellowship. So, over to you Clara.

CT          Thank you so much CogX and the Department for Digital, Culture, Media and Sport for this incredible privilege of addressing all of you today. Speaking at this first-ever safety tech expo is a particular honour for me, because I never imagined myself being able to say as a child, “I want to grow up to protect people online”, and I'm so thrilled to see the safety tech sector blossom into the incredible community it is today.

Some of you might be familiar with my work in safety tech, as Suki just pointed out, but today is the first time I’m sharing the story of my reason why. I fell in love early on with the digital revolution, where I could use the internet to access any kind of information out there. However, it was also the internet that nearly drove me to end my life 17 years ago. In my teenage years I grew up in a small town in northern California, always striving under enormous pressure to be the model minority, and as a teenager I started using exercise as a means of stress relief. What started off as what most people would consider a healthy and harmless hobby actually propelled me into the weeds of pro-anorexia websites. These websites have been very common since the early 1990s, from LiveJournal to Tumblr, and now TikTok, and as I was looking at these websites my mind swam in blogs filled with emaciated photos of young women sharing inspiration with each other, comparing their thigh gaps and competing with each other on how low they could drop their weight and how soon they could be thin enough to die. There were quotes like “if your ribs aren't showing, you aren't trying hard enough”. These websites were endless: once you tried looking into one you would stumble across 20 more, and within a few clicks you were already in an online echo chamber, a community of young people trying to end their lives. One month after stumbling across these sites, my weight dropped to a mere 4.3 stone, which in the US is 60 pounds. At a check-up with my physician I was immediately hospitalised, and the doctors told my parents, “if your daughter doesn't get additional help, she will be dead by next month.”

Eating disorders are actually among the deadliest mental illnesses today, second only to opioid overdose, and it took me a year and a half, dropping out of school, and several years of therapy to make a full recovery. Like I said before, in the late 2000s the growing popularity of online blogs led to a huge upswing in these pro-anorexia websites. In 2011, Yahoo removed 113 pro-ana sites from its servers; Myspace, Tumblr, Instagram, Pinterest, Reddit and many others followed, and in one study in the early 2000s pro-anorexia websites outnumbered recovery ones 10 to 1. I share my story because my experience is not singular, especially in the wild wild west of the internet, where there are measurable quantitative and qualitative offline consequences. And today these online behaviours shape the culture that we choose to build offline with the products we make, a culture that can haunt individuals, communities, and our daughters and kids for generations to come.

Today there are many new social arrangements being formed on the web that help people find and share incredible social interests, no matter how distant they are. But there is also a lot of bad. These online communities include not only people like myself 17 years ago, vulnerable girls battling eating disorders, but also 15-year-old western schoolgirls coerced as teenage brides to join ISIS. However, today many people stand alive and protected, as advocates, doctors, mental health professionals and companies have worked over the past few years to limit the amount of toxic content available online, pushing back with healthier user experiences and demanding better content standards. But as other complicated issues have emerged in recent years, from online disinformation, to fake news, to terrorist content, to child exploitation imagery, which Julie (Cordua) will talk about later, the need for collaboration and innovation, and the moral imperative for everyone to act, remain more urgent than ever.

Today bad actors are mobilising and working together in ways we are not as a global safety tech community, and most of the time these issues are not addressed until they hit a crisis moment, and especially only when they hit a western democracy. Today, when people ask me “what is the best online marketing campaign?”, I think about the tactics of ISIS: its media empire, from apps to blogs to propaganda videos, that led 900 British citizens to join the Islamic State. They bought plane tickets, found information and instruction on how to travel overseas, and left. When people ask me about the best growth-hacking tactics, I think about the Russian bots, the Internet Research Agency, and their ability to quickly build communities, amass followers and even lead to physical protests. When people talk about the best security processes and practices, I think about child exploitation: among the first to use end-to-end encryption and a lot of these secure messaging platforms were these very bad actors. And when people ask me what is the best return on investment, I think about Dimitri, one of 300 Macedonian teenagers who made over 600 thousand dollars in online advertising by creating clickbait content during the 2016 presidential election. His website garnered 40 million views, but he made an enormous return on investment for spreading lies.

And today I want to share three lessons learned from my time working across the private, public and non-profit sectors that can help us build more civically responsible online cultures. The first lesson I learned is that, unlike bad actors, we are not learning how to collaborate. These bad actors are learning from each other, from ISIS to white supremacist extremists, studying the ways each is able to spread content.

Today everyone in safety tech has the best intentions and the desire to collaborate, but there are many barriers. We use different terms: the safety tech industry uses “trust and safety”, a lot of researchers in academia use “platform governance”, and these terms are endless. We also tend to work in silos: if an issue is in terrorist content, we don't talk to those working in child exploitation; we don't work across the aisle. And we are also very quick to blame. It's so easy for someone to accuse one company of not doing enough when they are doing nothing themselves, so we must stop working against each other, learn from each other, and recognise that everyone, with pure intent, is trying to solve the same problem. We must collaborate when there is no crisis, not only when there is one, and it's so important for content policy teams to talk to government, to talk to engineers, to talk to end users, to really figure out how to problem-solve. This problem is not solved by those working in the technology sector alone.

Lesson number two is that we have to stop thinking about investment in safety tech as a cost centre, and see it instead as a longer-term investment with the ability to generate a return. As we mentioned earlier in the video, cybersecurity was not an industry that existed 20 years ago, and today most people don't see cybersecurity as a cost centre to companies; we see it as a mandatory barrier that prevents major cyber-attacks from impacting data access. Today safety tech is of course not even close to that: most people only invest when there is a crisis moment, while bad actors have been doing this with longer-term intentions in mind for many years.

In 2014 Russia created a fake explosion in the US, the Columbian Chemicals plant hoax, to try to understand how western media was consuming and responding to content online. That was 2014, two to three years before the US presidential election, and all these bad actors are thinking years ahead of time, whereas we only think about tools when something really bad happens or when lives are lost. Now is the time to bake safety into our tools. And last but not least, there is no silver bullet; we can only design the online culture that we demand. It's time to empower users and to empower moderators at major companies, and there are already really great examples of this. A lot of people forget that safety tech today isn't just us, isn't just companies, isn't just governments; there are day-to-day users who also help shape what we see as safe online. Wikipedia, as one example, is an enormous community where thousands of community moderators determine what is acceptable information on that platform. But as the web becomes more decentralised, it's harder for these tools to be effective; it will be up to users and safety tech architects to determine the culture that we want to see, so we all have this strong moral imperative and responsibility. Today I hope you feel the same urgency, passion and energy as I do: it's not one person's, one company's or one particular government's obligation to act, it is everyone's online civic and moral imperative to build this community together. And today I’m thrilled that this conference will allow us to do that. What are our goals by the end of this conference? Well, I hope that in 10, 20 or 30 years this is not just the story of me, of how the safety tech community helped save my life and others just like me, but also the story of us, of what we can do to prevent future bad actors from exploiting the online space. And this is a story of now.
There is no perfect time to invest in safety tech; it is now, and it is a long-term investment that we cannot afford not to make. So thank you so much everyone, thank you for this incredible opportunity, and I will turn it back to Suki.

SF        Thank you very much Clara, and really, thank you for sharing your story, because it is very poignant, especially now when we have young people using online tools for school and for social interaction more during this pandemic, and more of them are susceptible. So thank you for sharing how safety tech saved you, and as we know, it does seem that bad actors gravitate to each other a lot faster than we do, so collaboration is key. And that's really why we are here today. First and foremost we have our first fireside chat with me, Staying Human in the Age of Data, and here with us is Rishad, I’m gonna mess up his name, Tobaccowala, I didn't, woohoo, author of ‘Restoring the Soul of Business: Staying Human in the Age of Data’. How are you today, Rishad?

RT       Excellent, thank you Suki. Very glad to be here.

SF        Oh indeed. Well, you know, I call myself an analytical storyteller, and that's because I tell the story of data; I marry the quantitative with the qualitative. And you know what, Rishad? You recently published an article about strategy being the future competitive advantage, and you propose there are three unstoppable trends that every company and individual must align with, wherever they are in the world, regardless of the industry in which they compete: globalisation with a new flavour, the three demographic divides, and the third connected age of technology. I’d like to focus on that third trend, but first I’d like to know, and let everybody know, how you came to think that way, through all the research that you've done previously as well as your awesome book last year.

RT       Well thank you very much. So I'm going to talk about the third connected age of technology. As we know, technology has been around since the advent of fire, and the internet, and online, have been around for a long, long time. But I truly believe that the age we entered into, which we're now trying to solve for, began in 1993 when Tim Berners-Lee's world wide web took off, and I call that the first connected age. In the first connected age we connected to transact and we connected to discover, and the companies in the western world that did that very well were Amazon and Google. In 2007 we entered the second connected age, which built on the first one, and in this particular age we connected to everybody, through social media like Facebook and Twitter, and we connected all the time, because of mobile phones, which basically were Apple and Samsung. And now we've entered the third connected age, which again builds on the previous ones, where data is connecting to data, which is AI or machine learning; there are new ways of connecting, which include voice; much faster forms of connecting, which is 5G; and then we're continuously connected to the great eye in the sky, which is basically cloud-based computing.

And what happened along the way, without us knowing it, is that most of these advances, while the foundations were created by government and the underlying technology was created by companies, were actually funded by advertising; if you think about Google and Facebook and others, it was advertising. And advertising basically did good things for the internet, but it also did a couple of bad things. The first was that it focused on users and consumers, and in order to get consumers and users clicking as much as possible it was basically built on enragement and engagement versus enlightenment; that was the one big issue. And the second, which became a big, big issue, is that we also began to understand that microtargeting could be extremely powerful in selling things, but microtargeting could also be very powerful in all kinds of things, from bullying to the birth of polarisation. So I began to understand that, without having a soul, without having overall meaning, without having other thoughts, we had this massive wealth-creation engine which was also, to a great extent, a massive social menace. And what began as an advertising operating system ended up being a society operating system that society now needs to grapple with.

SF        Yep, exactly. And what I thought was really insightful, in the framework of safety tech, is: how are companies able to navigate beyond those toxic forces that they sometimes created, depending on what type of company they are, such as disinformation and harassment, which are causing real and growing problems for them now? How do we stay human, how do they stay human, while reinforcing their internal guidelines for employees and their customer base, their users, without sounding like a sort of lock-stock-and-barrel T&C? I mean, how do they do that?

RT       It's difficult, but also relatively easy if you keep humans and people in mind, versus consumers and users. So there are five quick things that people can do. The first one is to start thinking about citizens and people, versus just consumers, because consumers means you maximise for business; citizens means you maximise for society. So that's number one, and as a result, recognise that safety tech must find ways to offset polarisation, inequality and the breakdown in trust. Those are the three key things: how do we restore trust, how do we find ways for people to engage with others versus polarisation, and how do we address inequality, which is the key thing.

One of the ways, obviously, is to find the bad actors, and to a great extent it's not just finding the bad content but finding the crowd and the instigator, because when you find the instigators you can stop a lot, and the whole idea is how you find the instigators. And there are two other thoughts. One is, as a company, what are some of the criteria you should look at besides looking at citizens? The criteria that everyone should look at as a company, or even as an individual, are: what is the safety of my data? That's number one. How do I maintain and preserve my reputation? That's number two. How do I have some control over my privacy? That's number three. And four, how do I not get tricked and drugged, and how do I make sure that I have some optionality so I can get off this? So those four, which are data, reputation, privacy and optionality, are extremely important. And for all of us who are creating anything, whether services or technology or mindsets, I use the word save, ‘S-A-V-E’. ‘S’ is to continuously keep thinking about society and social good. ‘A’ is accelerate: figure out stuff fast, a stitch in time saves nine, don't let problems fester. ‘V’ is values: values matter, there are good values and there are bad values, and while we may disagree about values, you have to make a value judgement. You can't, as certain heads of Facebook and other people basically believe, say that all thoughts are equal. No. Science actually matters, being good actually matters; all thoughts are not equal, and at some particular stage we have to think about that. And then finally, ‘E’: how do we correct errors, both the errors of others and our own errors, because we are human and we will make mistakes.

SF        I like that. Of course, you know I do like acronyms, we all like acronyms: society, acceleration, excuse me, accelerate, not acceleration, accelerate, values and errors. So there's one takeaway, people: if there's anything you remember, it's the S-A-V-E acronym, and what you should do with your organisations, and also what you can actually do with yourself, right?

RT       It is, it is with yourself, and one of my most popular pieces, which I wrote this last Sunday, for those who want to see it, is about how we architect our minds. I call it mindset architecture. We think a lot about architecting technology and organisations, but eventually we have to architect our minds, and what are some of the best learnings and thoughts about that, because well-architected minds will make for a great society.

SF        Oh yes they will, yes they will. You know, right now I'm still thinking about the values aspect, the values and the errors, because I think that's key when it comes down to safety tech. With online harassment, misinformation, disinformation, it really comes down to how you value what somebody says, and how those errors get fed in; I mean, we know, garbage in, garbage out. So how do people navigate that? I mean, I’m looking for answers from you, so: how do people navigate that?

RT       The way people should navigate it, whether you're an individual or working in government or in a company or in technology, is that you must think about yourself as a leader. Every single person is a leader, and we can decide as a group how to lead, and the characteristics of leaders are the ways we need to move forward. There are five characteristics. The first one is that leaders are competent and capable, which is that we need to be competent and capable of understanding the ramifications of both what we're saying and what we're doing. The second one, interestingly, is integrity, which is: can you tell the truth? Can you be trusted? Speak with integrity, deal with integrity; that's the second one. The third one is empathy, which is something we are struggling with in this age of polarisation. As I tell people, all of us are sometimes surrounded by our algorithmic feeds, our own little Facebook friends, our own little groups, all of them basically saying we're really cool, so at some particular stage it makes us believe that our flatulence smells like Chanel No. 5. It doesn't, okay? So the whole idea of thinking about other people and other perspectives is very important; that's empathy. The fourth one is vulnerability, which is that wherever you go and whatever you do, you'll get hurt. There's vulnerability in how you help people, as well as in how you surround yourself with people who can help you. And then the fifth one is inspiration. What is so amazing about this conference, and about the speakers we've had before, yourself and others, is that in the end we need to understand that people choose with their hearts and use numbers to justify what they just did, and we need to keep that emotionally human inspiration component. So it's those five: competency, integrity, vulnerability, empathy and inspiration.

SF        Awesome, and I really love the fact that you just said that ultimately we decide with our hearts. That's really what we want to do today: humanise safety tech, make people realise that it's not just about the numbers, it's not just about looking at the data; it's the combination of both of those that will make it what it really needs to be.

RT       Absolutely, because while we are living in a data-driven, silicon, digital age, the people on the other side of the screen are analogue, carbon-based, feeling people: we choose with our hearts and we use numbers to justify what we just did.

SF        Exactly and thank you so very much. As always our conversations are never long enough, Rishad, never never.

RT       Thank you.

SF        Thank you very much for sharing your thoughts, your insights and your great knowledge, and so on we press, on we press, onwards and upwards. Thank you very much. Well, our second keynote for the day, ‘Protecting Children Through Technology’, is presented by Julie Cordua, who is the CEO of Thorn, where she's building technology to connect the dots between the tech industry, law enforcement and government so we can swiftly end the viral distribution of abuse material and rescue children faster. That's from the blurb of her TED talk, which is highly informative, fascinating and also very emotive, and you should check that out too. So thank you very much, and over to you Julie.

JC        Thank you Suki, and thank you all for having me today, and also for highlighting this topic of online child sexual abuse within this overall conference on safety tech. As I get started, I want everyone who's listening to just take a minute and remember the time in your career or your life when you realised the power that technology could have. For me it was many moons ago. Early in my career I was working in wireless technology, this was before the iPhone, believe it or not, and the company I was working with, Motorola, was connecting some of the most rural villages in the world with cell phones: one phone per village. I remember hearing about this project and what would come out of it: you would see communication with loved ones who had left, you would see greater access to commerce, as farmers and people who were making things could find pricing or buyers further away. And I looked at this early in my career and I thought, oh my gosh, this is just going to open up the world, the opportunities that are going to come, and it drove this sense of optimism and belief in the power of technology for me. And I think for many of us who work in safety tech, or for me specifically, I’ll speak for myself, I work on ending child sexual abuse online every day, and I have to hold on to that optimism, because all of us are often on the front lines of seeing some really rough things, but I want us all to anchor back to that optimism that probably brought us to the technology field.

But one important point in that, is that in this field we have to combine that optimism with realism. We have to look directly at the face of bad actors that try to compromise the beauty of what is possible in what we're building, and we have to build to oppose them. And we also have to look right in the face of those who are being left behind, and we have to make sure that we include them. Because if we do that we will ultimately build the internet that we all deserve.

I want to share with you the story of who I build for. It's a little girl whose parents divorced when she was five, and she began to spend weekends at her father's home. He began to abuse her, and he would document it in images and videos and share it online. This continued for seven years because, as many of us know, perpetrators are very good at convincing their victims not to speak about the crimes being inflicted upon them. It wasn't until seven years of this abuse had gone by that a police officer knocked on the door of her mom's house with a picture he had found online, actually found on a computer in the Netherlands, and this was a child in the US, and asked the mom, “is this your child?” He said there were hundreds of images and videos circulating online documenting her abuse. They arrested her father, and he's now in prison; this was a decade ago. A hundred and fifty thousand images and videos have since been reported to the National Centre for Missing and Exploited Children, images that have continued to circulate online and are discovered on hard drives as cases are found. They've been on Facebook, they've been on Wikipedia; they have covered every square inch of the internet. And as her mom testified, the recovery, trying to deal with what came after the physical and emotional abuse, has been far more difficult than the actual sexual abuse. That child, and the thousands more like her who have been abused or are being abused today, is who we at Thorn build for, and who I believe most of you listening build for as well.

I want to help you understand the breadth of what I’m talking about, because while it is one child that we centre in our work, it is much broader than that as well. The size and scope of the issue of online child sexual abuse material is massive and growing. It was an epidemic far before the global pandemic that hit this last year, but the circumstances of this last year also exacerbated many of the vulnerabilities that contribute to this abuse.

Last year more than 65 million files of child sexual abuse material were reported to the National Centre for Missing and Exploited Children in the United States, and that is just one clearing house. Most countries do not have a clearing house for this material or require companies to report, so when I say 65 million pieces of child sexual abuse material, it is just the tip of the iceberg: it is just what is found when companies proactively look, and it only covers what passes through one central clearing house. Those files were split almost equally between photos and videos. The majority of that content depicts children younger than 12, and it does include extreme acts of sexual violence. We're also seeing new trends emerge, possibly exacerbated by the pandemic, where children were at home with their abusers, away from the mandatory reporters, teachers, doctors, counsellors, who might have reported abuse. We saw nearly a doubling in the number of reports of online enticement. What that looks like is children being approached by people online. We see it happen in livestreams, where children have gone live and people in the chat encourage them to do more and more sexual acts; or people friend them and private-message them, encouraging them to send sexual images that can then be redistributed on child abuse forums or used to extort those children for more and more abuse material; or people groom them to meet up for in-person abuse.
So that's a 97 percent increase in the number of online enticement reports to the National Centre for Missing and Exploited Children. Another crime we're seeing emerge is livestreamed abuse, whether that's children going live and being groomed in the chats, as I just described, or situations where caregivers livestream the abuse of someone under their care for payment, which often happens across different socioeconomic contexts around the world.

These are hard realities, but this is about looking realism straight in the face and really understanding who we are building a better internet for. When I become overwhelmed by the reality, I go back to my roots of optimism. I do think that technology, and more importantly all of you, the innovators behind creating a better internet, can make a huge impact on the lives of these children and the generations to come, if we place them at the core of what we are building.

And so I want to propose a guiding principle and then three focus areas for work. First, the guiding principle is what I've talked about already: putting the child at the centre of development. Talking to kids, understanding their experience. At Thorn we've spoken to over 5,000 children through our research, and many millions more online through our campaigns, hearing directly from them what their experiences are and what they need out of an internet designed to keep them safe and healthy when they are online. And then we translate those experiences into the heart of the products we build. We've learned from kids, when you talk to them and you don't shame them, that sending nudes is becoming a normal part of them exploring their sexuality. 20 percent of kids 13 and older have already shared a nude online. We also learned that a third of the sextortion victims we spoke with had never told anyone about their sextortion until they disclosed it in our research. We also know that younger and younger kids are being solicited online: a quarter of the sextortion victims we spoke with had been targeted before the age of 12. And we know that many of the kids we serve and build for cannot speak for themselves, literally too young to tell us what is happening, but we look at the data and we build to make sure that their abuse stops, and to prevent our platforms from being used to abuse children like them again.

Now for the three action areas I want to talk about. The first is prevention. Our first priority as a field should be equipping kids with the knowledge they need to enjoy safe online experiences. Many of you are already doing this, talking openly about how kids can engage with your platforms in safe ways. We should not be shy about the fact that there are perpetrators and there are risks to going online. We should understand that kids will engage in risky behaviour and send nudes. We should talk about it openly and we should help them understand how to safeguard themselves. We did this at Thorn through our 'no filter' campaign, mainly on TikTok, which my kids loved, where we had kids talking to kids, or kids talking to parents, about their online experience. Imagine a 14 year old asking their parent if they had ever shared a nude, or at what age they thought it was appropriate to share one. When we posted these, more than 70 million people viewed and shared them, and began to duet with them and talk about their own experiences. We have to first take this dark topic out of the darkness, give it language, give these kids the tools they need to talk about this openly so that they are not at risk of abuse. The second area of focus is built-in product safety: equipping children with the tools, not just the knowledge, to have safe online experiences. Our most recent research report, which will come out in late April or early May, looks at what support networks or tools kids use when they encounter potentially harmful experiences online. The good news is they are using the tools the platforms make available. They are blocking, they are reporting, they want more, but what they want to know is that when they use those tools, action happens. As my head of research explained, they are not reporting as frequently as they could for two reasons.
One, they think no action will happen, or what happens does not meet their expectations. She said it's a bit like putting a fire alarm on the wall so a child can pull it, but it's not actually attached to any wiring that sets off an alarm and makes something happen. We need to connect the action the child is taking with the reaction that will follow.

Another insight we learned is that often, when kids encounter harmful experiences, they will block instead of report because they think it's not a big deal. What they don't understand is that the person doing that to them is probably doing it to many others, so we need to increase that knowledge. And the third area is what we are doing as platforms to detect, remove and report abuse proactively, not relying on children. As I said, not every victim can speak for themselves. My belief is that every single platform that hosts user-generated content needs to have a scalable, proactive system in place to detect, report and remove child sexual abuse material.

And the tools exist today to do that. There are multiple options, from building it yourself to leveraging tools available through multiple companies. At Thorn we built 'Safer', a cloud-based tool allowing companies to easily spin up detection, reporting and removal services. Over the last year our customers have scanned over 5 billion pieces of content to protect their platforms from spreading child sexual abuse material online. And the good news is these types of tools are working. Last year we saw 45 new companies reporting to the National Centre for Missing and Exploited Children, and overall, companies that reported experienced an average 162 percent increase in the number of reports. Let me be clear, increased reporting and increased detection is good: 66 million files is bad news when it comes to bad actors, but good news when it comes to detection. There is more out there, and we need to shift the narrative and praise companies for implementing these types of detection measures.
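[Editor's note: the detect-report-remove pipeline described here is typically built on matching uploads against shared lists of hashes of known abuse material. The sketch below is illustrative only, not Thorn's Safer API; it uses an ordinary cryptographic hash where production systems use perceptual hashes (such as PhotoDNA) that survive resizing and re-encoding, and all names in it are hypothetical.]

```python
import hashlib

# Illustrative stand-in for an industry-shared hash list of known material.
# (This example value is the MD5 of a well-known test sentence.)
KNOWN_HASHES = {"9e107d9d372bb6826bd81d3542a419d6"}

def fingerprint(content: bytes) -> str:
    """Hash the file bytes. Real systems use perceptual hashes so that
    resized or re-encoded copies still match; MD5 only matches exact copies."""
    return hashlib.md5(content).hexdigest()

def scan_upload(content: bytes) -> str:
    """Return the moderation action for an uploaded file."""
    if fingerprint(content) in KNOWN_HASHES:
        # In a real pipeline this would also queue a report to a
        # clearing house such as NCMEC and remove the content.
        return "block_and_report"
    return "allow"
```

The key design point, echoed in the talk, is that matching happens proactively at upload time rather than waiting for a user report.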

I also know that all of this work doesn't happen in isolation. All of this has to be surrounded with an increased focus on policies, elevating this work to the C-suite and making it a priority in the company, and ensuring that regulators and legislators are based in realism as well: that they understand the harms that happen online, but also the potential that technology has to create the internet we all deserve. Our team at Thorn, through the technology we're building and the partnerships we deploy around the world with IWF, WePROTECT and others, has set a vision of eliminating child sexual abuse material online over the next decade, and I am so honoured to speak to this incredible audience of innovators today who also believe that we can create an internet for us all, including our children, an internet that we all deserve. Thank you so much for your work in this area.

SF        Thank you, thank you, thank you, thank you Julie. Sometimes it hits heavy on your heart, because while we're thinking about this being the work you're doing, it also affects everybody. This isn't just about the children, it's about everybody. People here watching today who are running businesses, who are CTOs, CFOs, technologists, whatever, this is also about the children that you have. It's about the children that are going to be using your products, it's about the parents of those children using your products. It's about everybody, and so it's really powerful what you said. You kind of get a little teary, and it's like, oh, I have to get back on camera after this, can't be crying.

JC        I cry all the time so it’s ok. 

SF        So really, thank you so very much. Prevention, tools and detection, those are the things people really need to focus on. Talk to kids, let kids talk to kids, because they tell each other things that we don't know, and that's just the way it is. But thank you so very much, and as I said, everybody here, follow and find out what Thorn is doing, and just, you know, reach out and talk to people like Julie, please, please.

JC        Thank you so much Suki.

SF        Thank you very much, and you know what, now that we've had that important call to action for children, who are our legacy, once again I just want to say, check out the organisations in our event booths to find better ways into the technologies she's been speaking about. Please do that, that's really important. And right now we're gonna move on to our second fireside chat, which is also me again, don't know how that happened: building the online real-time communities of the future. And we have some wonderful guests for this panel who are doing some great work. We have Marc-Antoine Durand, who's the CEO of Yubo, and Rachel Franklin, who is the Senior Vice President of Positive Play for Electronic Arts, EA, and Weszt Hart, who is the Head of Player Dynamics at Riot Games. As my speakers all know, they'll get about 30 seconds to outline what they do and then we're gonna dive in, because we don't have a lot of time and I want to make sure it's used optimally. So let's go alphabetically: Marc-Antoine.

MD       Thank you, thank you very much. I'm Marc-Antoine Durand, I'm the CEO at Yubo and I'm in charge of safety and product policy in the company. The thing that is really important about online communities, safety and the future of safety is to work on three points, and I'm very aligned with Julie from Thorn. The first one, the main thing, is that we have the data and the technology to really be able to go further. I think as a tech company we have the responsibility to go further in terms of safety, and the goal here is to be more proactive. I'll try to give you some very precise examples. For example, at Yubo we are able to detect any inappropriate picture, nudity, drugs, etc, and if we detect that, in less than a millisecond we are able to prevent the user from uploading this picture. So with the tech we have put in place we are able to be proactive and make sure that bad things don't happen, because we have the data, so why should we let people upload bad content if we have this knowledge? The second thing is that, you know…

SF        I'm gonna stop you there, Marc, because we definitely want to get into that, and I don't want to have to say 'let's go back to what Marc was saying two minutes ago', so I'm gonna have Rachel and Weszt just do a quick intro, because that's a point we want to come back to for everybody. So Rachel, I think alphabetically it'd be Rachel next, right, so just a quick, quick intro.

RF        Well thanks for having me, I'm Rachel Franklin and I work at Electronic Arts. We make video games, and games are an incredible way for people to connect all over the world, but it's important for us in the positive play group that we create online communities that are inclusive, safe, balanced and fair, because if people don't feel safe they won't come back. I think Rishad said that people choose with their hearts, and so it's important that they feel like they want to be part of our online communities.

SF        Awesome, and Weszt. 

WH      Hi, my name is Weszt Hart. As Suki mentioned, my role is the Head of Player Dynamics at Riot. Player dynamics is a formal design discipline, at least we're shaping it into one, and it's really just about helping people play well together. Riot Games is known primarily for multiplayer competitive games, and the heat of competition doesn't always bring out the best in us, so we're trying to figure out how we can make things better, right, how can we make games better so they're more rewarding and still keep the benefits of competition.

SF        Yeah, yeah, we know we all get a little competitive. I mean, I was the queen of a lot of games when I was younger, and I was not a very nice player, especially in Mortal Kombat, I was not very nice in those days. But Marc, as you were saying, you have some excellent content moderation on your site. Your app is quite excellent in doing that and you've been pretty groundbreaking, I mean, you do things with warning pop-ups, and you've built some solid aspects into your app. What are some other key takeaways you can offer brands that really need to keep their community safe?

MD       Yeah, the thing is, as I said, we have all the data to really make sure that we can go further, so yes, with the technology we can be more proactive and we can also give more tools to users so they can protect themselves. As you said, for example, we are using warning pop-ups. So for example, if you're sending private information like your postal address, we directly send you a warning saying 'are you sure you want to send that, because it could be dangerous to share your private information'. We think that with this technology we can enlighten the user, give them clearer information about things they are about to do, and not only do moderation, which could be just deleting inappropriate content. It's really about giving users as much information as they can get. And it leads us also to think about the future of safety as being around education and prevention, by using technology. I think that's one of the next steps of safety: to prevent things from happening and give users all the information they need to realise that sometimes they could be putting themselves at risk.

SF        Awesome, and you know, I do like the idea of your warning pop-ups, letting people know in advance, because a lot of what we see in many technology industries is reactive instead of proactive, and the fact that you're doing that is really changing the mindset and the behaviours of younger people, so that when they become older they're not just going to go click, click, click. Probably now you have teenagers actually reading T's and C's on websites, because now they're realising, 'hey, there's something here that I should be aware of'. That's really great, I love that aspect of changing mindsets. And I know that positive play has put up this great framework of things, which is impressive stuff, and I absolutely love it. So Rachel, could you speak more about that and let people know?

RF        Absolutely, I think you're touching on something that's so important, which is stating clearly what the guidelines are. You know, we talk about being competitive, but we don't talk about how to be kind and collaborative. And so the positive play charter that we put up this summer really is trying to state those expectations in a very clear, friendly way to our players, but also set the commitments that we as a company are going to make. We do have proactive tools like text filtering, we have in-game reporting and we have a cross-company disciplinary matrix, which is basically taking those positive play commitments we're making and upholding our end of the bargain if players are bad actors. But there are also other proactive things we can do, like just elegant game design up front. One of our games, Apex Legends, has a ping system, which means you can communicate with your team members without voice chat, so it effectively circumvents one of the ways players can be particularly toxic.

But I think we're trying to tackle this holistically, in all the ways that you mentioned, in a strong but humble way, where we're communicating effectively.

SF        'Strong but humble', I like that. And you touched on game design, and I know that Weszt has a lot to say about the game design aspect and how you can shift the roles in making the gaming experience safer. What are you seeing in the future, where are you going, what does it look like, how are you designing for that? Because I know that's your thing. Let's get the inside scoop.

WH      There's a lot of stuff that we've got going on, but I would say that I love Rachel's thinking. I love the way that the positive play framework is hoping to shape the thinking. I think the opportunity we see in games is to simply make them better, and the example you used, the ping system in Apex Legends, is fantastic, right, but it's very much focused on the game. It keeps you in the moment and it helps you get more out of the game, which I think is really fulfilling the promise of online interaction in general. And I think there's a natural arc we go through when we get into this space. We start off with the punitive, and then we begin to see the limitations of moderation. Ultimately it kind of aims for zero; it doesn't necessarily bring out the best of the game, the best of the experience. So you start thinking proactively, 'how can we avoid these problems', and then eventually you start realising, wow, our systems might actually be contributing to friction. We try to avoid the term toxicity because it's so subjective, it's such a judgment, and you can't solve for that, so you've got to get specific. The more proactive you get, the earlier in development you get, and then you start asking different questions. You're not moderating anyone yet, there's no one in your game, so what do you start thinking about?
Well, facilitating successful interactions, fostering healthy coexistence. And where I think we end up, what I've been thinking a lot about and where we're looking in the future for the games we're even just imagining, is where does tech go then, right? If you can think earlier, what role can it play? I think ultimately it plays a role where moderation systems can never reach, which is everywhere players ever are, and it leads you to think about resilient communities: 'what can we do to empower players?' I really think the future is player-powered, and it's going to be defined by companies that think like theme parks, you know, where it's just weaved in and it's all about the optimal experience. But shoot, even with League of Legends, our players taught us what was good about it, right?

SF        Yeah, I see Marc and Rachel both nodding. Marc, do you have anything to add there? Because I think that mindset is definitely something you're all working on, and I know that when the three of you get talking it can go on for a while, but feel free to jump in.

WH      Calling me out there, a little long-winded.

SF        You've been really good so far, I'm very impressed. But Marc.

MD       Yeah, I agree, I definitely agree with that. I think the solution is not only to detect; it's necessary to create the best technology to detect as fast as possible, I think that's the evident part of the job, but we also have to create patterns and product flows that make sure the community is getting more and more empowered and educated around the way to behave online. I think that's how the platform needs to work on that, because, as you say, you cannot only punish people for doing stuff, because at the end of the day you won't have any players or users. You want your community to come together to do something good, to behave and to help each other, so that's why at Yubo we really make sure that people support each other.

We also want to reward the good behaviours, not only punish the bad behaviours, so that's really important, to be positive about something. Sometimes what we do is, when someone is complying with the rules, we just say 'thank you', and people are happy because there is an interaction, and they think, 'okay, I did something good, I can be part of this community, and there are some rules that I'm okay with'. So I think it's really important to think in terms of community as a solution, not only finding content. It's about profiles, but mostly about groups of people, so it's really important for design and product to be really proactive, as you say, Weszt.

SF        Alright, well, thank you very much Marc. And Rachel, hey, I've got to get you in there to get your words here.

RF        Well, I mean, there's just so much good here. I think the proactivity is really important up front. I think also just thinking about the exposure of other players to these things, and even your moderators; the more we can catch up front, the more we can weave it into the fabric, like Weszt was saying, of how we make games, I think is incredibly important. I'd also point out that the positive play group, which I run, is really a commitment by our company to show how important this is, but it's also called positive play, not, you know, stopping the negative play. So there's a function here of 'yes, we want to stop the negative', but we also want to model and promote the positive, so that we can foster these communities and empower these players. And I agree, this is how it's going to work: by empowering our players to have these self-sustaining, resilient communities, to quote Weszt.

SF        And you know what, I think also, when it comes down to wordsmithing, when you say something like 'positive play' it automatically puts somebody in a positive mindset when they are in that framework. And as you said, Weszt, maybe we need to get rid of the word toxicity, weed it out, because the word itself makes you feel like you're doing something wrong. It actually makes you feel dirty just to say the word, even though you're not doing anything dirty, you're just explaining to somebody that this is not so good. But thank you very much everybody. With everybody that's here I could talk to you forever; there's so much information that you have, and I really would like you to keep imparting it to people out there in the world.

Thank you so very much Rachel, Weszt, Marc, I really do appreciate you taking the time to speak with us here, and please continue the dialogue with others.

RF        We will, we will.

SF        Thank you, thank you very much, and so we'll be moving on. That was an invigorating discussion. It really reminded us how important a role the gaming industry holds in safety tech, and how it can be instrumental in forging positive change. I mean, positive play, positive change. And that concludes our opening plenaries. We have five awesome breakout sessions until 1715, chaired by OSTIA, Thorn, OASIS Consortium, the Trust and Safety Association and a very highly esteemed media QC, so please, once again, check out our event booths and sign up for more information. And if you're leaving, please do come back; if you're not going to a breakout session, come back to enjoy our closing plenary. So thank you very much everybody, and enjoy your breakouts.
