
Safety Tech 2021: Regulatory compliance: what you need to know transcript

Speakers: Christina Michalos QC, Barrister at 5RB, Queen's Counsel and Author – (CM); Professor Victoria Nash, Oxford Internet Institute – (VN); Professor Lorna Woods OBE, University of Essex – (LW); Simon Saunders, Ofcom – (SS); Iain Corby, Age Verification Providers Association – (IC)

Session 

CM       Good afternoon, and welcome to breakout session four of Safety Tech 2021: regulatory compliance, what you need to know. I'm Christina Michalos, a barrister practising at 5RB in London, which is a specialist media and information law chambers, and I'll be chairing our panel today.

The word regulation may lead you to fear that this is going to be a dull, dry session, and that we will be leading you into a traumatising cat's cradle of red tape and bureaucracy from which you will never escape, but fear not: this session is going to be about the value to business of regulation, the social benefits, and hopefully, dare I say it, fun. If you're having so much fun you want to Tweet about it, please do, the hashtag is #safetytech2021. I'm privileged to be able to introduce four eminent speakers, but before I do, firstly, I'd remind you that you have the opportunity to quiz these experts throughout the session, so please take that opportunity; we have some fantastic experts and you'd be foolish to miss that chance. And secondly, I'd just like to set a bit of context, particularly for our international viewers, on the UK position here. It's easy to forget what life was like before the internet. Here's a quick quiz question for you: in the 11th century a group of scribes labouring continuously in an English monastery produced 66 books; how long do you think it took them? Have a think to yourself, have a guess. The answer was 22 years. In the year 1400, Cambridge University Library had 122 books, and it took them half a century to get to 330. But by 1957, magnetic disk storage on IBM computers cost about 70,000 dollars per megabyte. By 1980 that had come down to 500 dollars, and by 2008 one megabyte cost a hundredth of a cent.

The internet is ever expanding and storage costs have come down exponentially, and this brings great benefits, but the rise in user-generated content has posed many challenges, as has online abuse. So in the UK the government has publicly consulted on online harms, and the response was published in December last year. The government is going to legislate here to unlock innovation across digital markets, while also ensuring people are kept safe online and promoting a thriving democracy where pluralism and freedom of expression are protected. The Online Harms Bill will create a new duty of care to make companies take responsibility for the safety of their users, principally to tackle illegal activity taking place online and to prevent children from being exposed to inappropriate material. But the legislation will also address the types of harm that spread online, including dangerous misinformation, for example lies about vaccines, and destructive pro-anorexia content. The new regulatory framework will apply to companies whose services host user-generated content which can be accessed by users in the UK, or facilitate online user interaction, whether publicly or privately, when at least one of the users is in the UK, and it will also apply to search engines.

Freedom of expression is very much at the forefront of this, and so content and articles produced by news services on their own websites are not within the scope of the legislation, and the government has expressly said that it recognises the importance of below-the-line comments and reader engagement with the news, so user comments below articles will be explicitly exempted from the scope of the incoming legislation. This is going to be achieved by a low-risk functionality exemption. The appointed regulator will be Ofcom, and for our international audience, Ofcom is the Office of Communications, the UK's independent regulator for communication services, which includes television broadcasting, and we're very lucky to have a speaker from Ofcom today. So that's the UK context in which we find ourselves with regulation.

So I'm very privileged to introduce the first of our speakers, Professor Lorna Woods OBE, who is Professor of Internet Law at the University of Essex and a member of the Human Rights Centre there. She has a background in the regulation of the telecommunications, media and technology sectors. She initially practised as a solicitor in those areas and maintained an interest in them when she became an academic. Her recent research has been working on a systems-based mechanism for regulating social media, utilising a statutory duty of care, and has been funded by the Carnegie UK Trust. We're very lucky to have her. Again, a reminder that you can put questions to all of our panellists throughout the session, please do so. Ladies and gentlemen, Professor Lorna Woods OBE.

LW       Hello, and thank you Christina for such a lovely introduction. I'd like to take just a few minutes to elaborate on some points you made about the approach to regulation and the statutory duty of care. I've been wandering around telling people that this is systemic regulation, and by that I mean the focus of the regulatory regime is not about identifying types of bad content and taking it down, but looking at how the platforms and the services are designed and whether they can be made to be better environments for people by means other than just taking down content. At the heart of this is the statutory duty of care, and essentially it's a form of, dare I say it, health and safety for the internet. So this is a risk assessment at its heart, which looks at the service overall, but also at individual features of the service, right the way through from when audiences or users sign up and go on to the service, through content discovery and the way people respond to, share and react to content, as well as complaints.

So that allows a wider range of options, I think, both in terms of what the regulatory space can do but also in what companies who are within scope of the regime may do in terms of how they react to it. It's an assessment that takes place in context, so it's about the services the platform actually provides and who its users are. So it's not a one-size-fits-all regime at all, but allows companies to make their own choices and perhaps to develop their own innovative responses to threats that are out there in the environment. I suppose what we're saying here is that this is analogous to privacy by design, but instead of looking for privacy-enhancing technology, what we're looking for is safety by design: technology is never a silver bullet, but as part of this, what can be done to make a platform safer? And this can be a range of things: it could be about age verification, it could be about how content is promoted and disseminated, but it also could be about giving more tools to users to empower them to curate their own environment. But it all depends, I think, on what the platform sees its risk profile to be, including how its users use the platform. As you said, Ofcom is going to have a role, and I think Ofcom's role will be important; it will be interesting to hear what's said in a little bit, but I think it's important to recognise that the role of the regulator shouldn't be just about enforcement and imposing fines, but maybe about understanding what good practice looks like in this space. I think I've probably overrun my allotted few minutes, so I'll leave it there.
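A minimal sketch of what such a service-wide risk assessment could look like as a simple data structure follows; the stages track Lorna's description of the user journey, but the specific risks and mitigations are hypothetical illustrations, not a regulator's template.

# Hypothetical risk register spanning the stages Lorna describes:
# sign-up, content discovery, sharing and reactions, and complaints.
RISK_REGISTER = {
    "sign_up": {
        "risk": "under-age users joining an adult-oriented service",
        "mitigation": "age assurance checks at registration",
    },
    "content_discovery": {
        "risk": "recommender systems amplifying harmful content",
        "mitigation": "down-rank borderline material; audit ranking signals",
    },
    "sharing_and_reactions": {
        "risk": "rapid viral spread of abusive material",
        "mitigation": "add friction to re-shares; apply rate limits",
    },
    "complaints": {
        "risk": "user reports handled too slowly or not at all",
        "mitigation": "triage targets and transparency reporting",
    },
}

for stage, entry in RISK_REGISTER.items():
    print(f"{stage}: {entry['risk']} -> mitigation: {entry['mitigation']}")

The point of the structure is the systemic framing: each feature of the service carries its own risk and its own mitigation, rather than everything reducing to content takedown.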

CM       Thank you very much Lorna, that was very, very interesting, particularly what you were saying about empowering users. Our next speaker is Professor Victoria Nash, who is the incoming Director of the Oxford Internet Institute, where she is a Senior Policy Fellow. She has a particular focus on internet regulation and online child safety. Over to you, Professor Nash.

VN       Thank you very much for the introduction, Christina, and Lorna for the great preceding talk. So, I'm not a lawyer, I'm a political theorist by background, but much of my work focuses on child wellbeing and children's safe and positive uses of the internet, and it's with that perspective, I suppose, that I'd like to frame my three minutes.

As Lorna notes, the UK, but also other countries, so Australia, Ireland, and Europe, where we've now got measures in the Audiovisual Media Services Directive, and many other countries are making big advances in considering how children can be better protected by regulation in the online environment. And this trend is clearly only going in one direction, so the focus today on safety tech and compliance, and what companies can do to go further, I think is really important.

From my perspective, though, I think that having a very narrow focus on what's meant by compliance is actually quite unhelpful. I'd like to see companies going much further than doing the bare legal minimum, in particular because this focus on children's interests gives us an opportunity to ask how companies can serve children's best interests online, which ultimately requires more than just protecting them. And this is particularly vital because, if you look at how children use digital services, it's really clear that the online and offline are no longer separable. The internet is just a core part of their daily social life, their entertainment, their learning, their individual development, and so any focus that's biased towards just protection or just risk minimisation risks shutting a lot of those larger opportunities down. So, to me, safety tech is most exciting when it actually opens up the possibility of doing things like taking age-appropriate risks, for example when it enables children, in a safe way, to enjoy key rights, things like participation, expression or access to information. A good example would be the types of moderation service that we're seeing rise, whereby children can engage quite freely in chat rooms with their peers, safe in the knowledge that, if you like, no inappropriate adult content will be enabled.

So, to me, what this really means is that there's a key principle I'd like us to focus on in debates about compliance, which goes beyond narrow legal and regulatory compliance, and that is to see compliance as the jumping-off point rather than the upper limit. So perhaps in an ideal world we would see product design building in compliance and safety by design at the very earliest stages of product development, rather than just bolting them on at the end and checking that they're legally compliant. And so, if we have a group here of compliance officers and regulatory specialists, I would say you have a role to play not just in selling safety tech but arguably in producing technology which will help children flourish. So, yeah, my message overall is let's set the bar pretty high if we can. Thank you.

CM       Thank you Vicky. I think that's really interesting about bare legal minimums, and hopefully safety tech will develop as a sector to be analogous to work in diversity, where companies that take the lead and have a more diverse workforce, in terms of both social and ethnic diversity, are shown to be more successful and have greater revenue. Hopefully those who take the lead in safety tech will similarly benefit and be seen as leaders, so I agree with you: I think it isn't about the bare minimum, and hopefully our audience here today are here because they're interested in this and that's their approach too.

So our next speaker is Professor Simon Saunders, who is the Director of Emerging Online Technology for Ofcom, which, as I said, is the UK's independent regulator for communication services. He leads teams responsible for monitoring and enabling the positive impact of technological innovation on UK people and businesses across the sectors regulated by Ofcom. This includes building Ofcom's capabilities in technology relevant to its evolving duties in online content and services, and, as I said in my opening, Ofcom will be the UK online regulator. So I'm sure our entire audience, and me as well, are waiting with bated breath to see if Simon's going to spill any beans. Thank you, Simon.

SS        Thanks very much Christina, and good afternoon everybody. It's great to be speaking at this event, which, putting together safety and technology, is right at the intersection of the things that I'm responsible for at Ofcom. Those were great introductions from everybody, setting the context for our work here. As the independent regulator for communication services in the UK, we really want the whole range of those communication services to be available and useful and safe for everybody, and I would say more useful and more safe over time, to the points about bare minimums and so on. Clearly, amongst those communication services, that increasingly involves online services of all sorts on the internet, but also the application of internet-style online technologies in the evolution of the telecommunications and broadcast services that we already regulate. Most particularly, as has already been pointed out, we now have a role as the independent regulator for video-sharing platforms based in the UK, and, having been named as the regulator for online harms across a wider range of platforms, our duties are expanding over time to include those online elements. As an example of how active we already are in this area, I'm pleased to say that today we've published a range of research and guidance relating to the video-sharing platform aspects of that, so there's some reading for everybody here, and that's an indication of how we're trying to make it as clear as we possibly can for platforms what it will take to comply with those regulations.

Now, all of those services that I've mentioned are created by technology and are underpinned by technology. It creates the opportunities and the usefulness of these services, but, as it proliferates, it also creates challenges in keeping people safe in their engagement with those services. The technology also, though, creates the potential to be the solution, or at least a big part of the solution, to those same challenges, if applied appropriately by those platforms. So understanding safety tech is right at the heart of what we need to do in this regime, and it's not realistic for us to assess individual elements of content, given the volumes involved and the rate of change of the nature of these platforms, which we might have done previously in some of the sectors that we've regulated over time. So, as Lorna highlighted, there's this concept of a duty of care which we will instead refer to, and figuring out what that duty of care actually constitutes at any given time is a more complicated piece. Clearly, given that this is fast moving, one of the things we need to do is to maintain a really good understanding of the capabilities and the effectiveness of safety technology, both today and the sort of gradient over time, for understanding whether platforms are complying with the duty of care. As well as the effectiveness of that safety tech, we also need to understand what it's like for platforms to access those technologies. Clearly, if only a few large platforms can afford to buy or can build that safety tech, then that may not be proportionate and appropriate when we think about the large range of platforms, large and small; so the access to the technology is as important as the effectiveness of the technology itself. When we're talking about safety tech here, there's a wide range of issues: certainly age verification, estimation and assurance are important, so is automated content classification, and a wider range of privacy-enhancing technologies. And it's not only safety tech per se; it's also important that we have a good understanding of the underlying internet infrastructure and the standards for it, which might change the way that content actually reaches people and businesses in the UK.

Likewise, it's not only about understanding the technology; I think we also need to play our role in actually enabling and encouraging innovation in these areas, to raise the bar, if you will, over time and keep people even more safe, even as the services proliferate. So, to do that, we've created an online technology team that's dedicated to that range of work, led by my colleague Fred Langford, who joined us from the Internet Watch Foundation. We already have some great knowledge and experience within that team, but we're building it further to face the challenges ahead, so if people watching are interested in joining that team, please do let me know. And it's not only the team within Ofcom that will do this; we'll also draw increasingly on a wide range of external experts, other adjacent regulators, external research that we will commission over time and so on, to keep that knowledge current and informed by the broadest range of perspectives that we can get. Thank you very much.

CM       Thank you Simon. I'm sure there are going to be lots of questions for you shortly, after our next speaker. Thank you for your questions so far, and a reminder again that this is a limited opportunity to quiz our really excellent speakers. You may have heard the famous phrase 'famous dead people make excellent commentators on current events'; as Francis Bacon said, 'a prudent question is one half of wisdom'. So that leads me into our fourth and final speaker, Iain Corby, who is the Executive Director of the Age Verification Providers Association, the global trade body for providers of privacy-protecting, standards-based age assurance technology. Previously he was Deputy CEO of the charity GambleAware, ran a research team in Parliament, and was a management consultant with Deloitte for 12 years, during which time he gained an MBA from UCLA, and he studied PPE at Balliol College, Oxford. So he's ultra-qualified to be talking to us today. Thank you Iain, please fire away, the floor is yours.

IC         Thank you Christina. I normally make a joke that I studied PPE, which stands for 'pretty poor education', but as Vicky studied exactly the same subject as me at exactly the same time at Oxford, I'd probably better not make that joke today. I wanted to talk a little bit about the avalanche of legislation that's coming, not just in the UK but across Europe, and in fact around the world, which might be a little bit intimidating if you're looking at this from a compliance officer's or a legal perspective, because there's just so much going on. You've heard already references today to the Audiovisual Media Services Directive, and to how GDPR is now being further interpreted in terms of more specific implications for appropriate content that children can see. We know that in the US, COPPA is being reviewed; it's taken some time and there's a new administration, but I'm sure they're not going to leave that untouched. And one of the main principles that we hear so often is that what is illegal in the real world should also be illegal online. In trying to implement this principle, governments come up with a clever bill like the Online Safety Bill, which we're expecting shortly, and perhaps try to do it all in one go, when we know that for every product and service around the world, in every jurisdiction, there are multiple laws trying to deal with the peculiarities of each of these different entities and the regulations that go with their sale and use. So I think it's going to be an evolving feast as we try to apply that principle of equality in the online and offline worlds.

But one fundamental input to that is knowing the age of a user at the end of a keyboard, and that's obviously a pretty difficult thing to do, a lot harder than it is in the real world, and so we're here really as age verification providers to try to help provide that basic piece of data on which other safeguards can then be built. Now, we as a trade association have been working to build standards, and to implement audit and certification against those standards, so that ideally we can encourage regulators around the world to adopt a very similar attitude to how they implement age assurance, and so that we are not in a position of having to reinvent the wheel in every different jurisdiction for every product, service and piece of content. So there's really exciting stuff going on. I think to some extent the issue we have at the moment is that the regulation is getting a little bit ahead of the technology: the age-appropriate design code in the UK asks websites to distinguish between the content they provide to a five-year-old, a ten-year-old or a fifteen-year-old, when we all know it's actually very, very difficult to discern age at all, essentially whether somebody's an adult or not, and it's certainly much more difficult to distinguish between different age groups of children. And we're also working to deliver interoperable solutions between providers, which is something we'll be working on in the UK, and we're now moving forward to work with the European Commission to try to deliver that not just for age verification but also for parental consent, which is another really important element of this. Hopefully that's enough of an introduction, and I'm sure Christina will entertain some questions.

CM       Thank you so much Iain, and thank you all for your questions. Do keep them coming in, because we've got until the end of the session, so we can take further questions as they come in; don't think it's too late.

So I'd like to start by taking a question here: 'do you think safety tech companies might have a broader impact on online norms, practices and morality, I suppose, beyond children's internet use?' I'd like to put that question first to Vicky, please.

VN       Thanks, such an interesting question. I think if we do it well then it definitely will have a broader impact. If we reflect back on prior technologies where we've introduced similar protective measures, even something simple like age ratings for films: those were primarily introduced so you could identify what age you had to be to go to a movie, but now it's very much entered common understanding what type of content that rating signals for a particular film, so we can use it to make judgments to inform our watching. For me, one of the areas where I'm really hopeful that we might see safety tech making a big difference actually relates to the measures that both Iain and Simon were talking about, which in particular relate to things like data protections, or privacy by design. In an ideal world, if we do this well, some of the most extensive data-collecting practices would only be opened up to us on the provision of credentials that show we're an adult, and, if you like, that just suggests that there is an alternative way of seeing and doing these things. So I think yes, in an ideal world, and if we do this right, safety tech will actually advance some of our broader social understandings of how these technologies work and what our broader social choices could be.

CM       Thank you. That leads in actually to another question that's just come in: 'have you considered the idea of offering identity verification to all users, then allowing users to opt in and out of only seeing content from verified users?', which is an interesting idea; I'll come to it in a moment, but Vicky, did you have anything you wanted to say in response to that?

VN       Yeah, I'm afraid I am a fairly strong defender of the idea that it should be possible to be anonymous online. I think we're very lucky sitting in a liberal democracy whereby, on the whole, you shouldn't get arrested for holding a particular political view and stating it in public. But I think that, yes, the idea that everybody has to identify themselves before they can go online would be quite a significant infringement of our freedom of speech, freedom of expression.

CM       Yeah, I definitely think that's right. I think sometimes people fail to appreciate the huge freedom of expression value in anonymity; it's seen only through the lens of a shield to hide behind when people are being abusive online, but we need to remember that anonymity is important for whistle-blowers and for people who wouldn't otherwise feel able to speak publicly. Certainly in the legal field there's a well-known legal commentator on Twitter who goes under the identity of 'the Secret Barrister', and I think the Secret Barrister has said that they wouldn't feel able to comment in the same way if their identity were known. Obviously, they've also written a book under that name. So I think that is something important to remember, but it would presumably be possible to have certain sites where people wouldn't necessarily be concerned about revealing their identity.

Can I put that question to Iain as well, please? What's your view about that, about identity verification?

IC         Very briefly: when you walk into a bar, the barman doesn't need to know your name, he just needs to know that you're over 18. I think it's important to maintain this distinction between age verification and identity, and I think it's particularly important when we're dealing with children. If children are trying to prove that they're old enough to see something online, I really don't think we want to be encouraging them at the same time to be bandying around their name and address online, simply because they want to access a particular piece of content. So I think there'll always be an important distinction between identity and age attributes.
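To make that distinction concrete, here is a minimal sketch in code, assuming a hypothetical provider that signs assertions with an HMAC key (not any real provider's API): the assertion carries only an over-18 attribute, never a name or address.

import hashlib
import hmac
import json

PROVIDER_KEY = b"demo-signing-key"  # hypothetical provider's secret

def issue_age_assertion(over_18: bool) -> dict:
    # The claim contains only the age attribute: no name, no address.
    claim = {"attribute": "over_18", "value": over_18}
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": signature}

def relying_site_checks(assertion: dict) -> bool:
    # The site verifies the signature and learns exactly one fact.
    payload = json.dumps(assertion["claim"], sort_keys=True).encode()
    expected = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, assertion["signature"])
            and assertion["claim"]["value"] is True)

token = issue_age_assertion(over_18=True)
print(relying_site_checks(token))  # True: age confirmed, identity unknown

A real provider would use asymmetric signatures so the relying site never holds the signing key, but the shape of the data, an attribute without an identity, is the point of the bar analogy.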

CM       Thank you. Lorna, did you have any comments in respect of either of those topics?

LW       Yeah, I think there's a difference between requiring everybody to identify themselves all the time, and a position where maybe a platform chooses to say, you can identify yourself and we'll show, I don't know, an equivalent of a blue tick or whatever, but you can also come online and not be identified and you get a grey tick, so that other users can make their choice about who they want to have contact with and who they want to listen to. Because I think the freedom of expression debate is very much focused on the speaker, and I agree with you entirely about the importance of anonymity, but I think we need also to look at the control of the users and the recipients of content. And as I think you mentioned, there's a role also for thinking, well, maybe different platforms can do different things; as the market currently is, there's a great tendency to think, 'oh well', the way Facebook does it, for example, is the way it is.

Well, there may be a range of other platforms out there, and in five years' time maybe the market will look entirely different. And as a final point, and I suspect Iain might like me saying this, there's a difference also between the platforms themselves getting hold of identity data and doing that verification process, and the existence of independent operators who actually owe fiduciary duties, say, to their users, so that when users choose to identify themselves, they're not having to hand over all their documentation every time. And that's a vehicle, maybe, for verified pseudonymity.
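A minimal sketch of what that separation could look like, assuming a hypothetical independent verifier (the class and flow below are illustrative, not any real scheme): the identity documents stay with the verifier, and the platform sees only a pseudonym and a verified flag.

import hashlib
import secrets

class IndependentVerifier:
    # Stands in for a third party owing fiduciary duties to its users.
    def __init__(self):
        self._records = {}  # identity data lives here, never on the platform

    def verify(self, real_name: str, document_id: str) -> str:
        salt = secrets.token_hex(8)
        pseudonym = hashlib.sha256(f"{document_id}:{salt}".encode()).hexdigest()[:12]
        self._records[pseudonym] = (real_name, document_id)
        return pseudonym  # the only thing that leaves the verifier

def platform_display(pseudonym: str) -> str:
    # The platform shows its equivalent of a blue tick for verified
    # pseudonyms and a grey tick otherwise; users can filter on the flag.
    return f"{pseudonym} [verified]" if pseudonym else "anonymous [unverified]"

verifier = IndependentVerifier()
alias = verifier.verify("Jane Doe", "passport-0000")  # hypothetical data
print(platform_display(alias))
print(platform_display(""))

The design choice this illustrates is the one Lorna describes: verification without identification of the speaker to the platform, leaving recipients free to choose whom to listen to.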

CM       That's a word I wouldn't even try to say. Thank you, Lorna. So, unsurprisingly, we've got a flurry of questions for Simon. Simon, one of the first questions is about the guidance released today relating to the video-sharing platforms: how does that guidance relate to the work the ICO, that's the Information Commissioner's Office in the UK, is doing around the age-appropriate design code?

SS        Yeah, really good question. I think we're building on the shoulders of giants in the work that the ICO did in the age-appropriate design code. The very concept of age assurance was really first referenced by that design code, in applying and enabling age-appropriateness standards, and I think that was a real milestone in the developments there. But dare I say there's another milestone today, in the guidance that we've put out: this is really the first time we've shared a public position on age verification and age assurance measures, in that consultation, so I'd invite people to take a close look at that and do let us know what your thoughts are. It is a consultation that you can respond to. In particular, it sets out that we do expect video-sharing platforms to have an appropriate and robust access control system when they're hosting material that is only appropriate for those above the age of 18, such as pornographic material and so on.

One thing I'd say overall, and this goes very much to questions of innovation and the way that Ofcom looks at this, is that we're very keen, as I said before, to establish that technology can actually play an important role in assuring the safety of users. And that goes, for example, for age estimation and verification: we'd like to establish the outcomes that we'd like to see, rather than the means of achieving them, because platforms should be able to evolve and improve their capabilities over time. On the other hand, that means that to really understand what's possible, we're not agnostic to the particular technologies that are available at a given time; we really need to understand those in some detail and indeed to work with them. So, for example, we will experiment with those technologies ourselves so that we really do understand them, and the discussion that you held previously about the relationship between identity and age verification is to the point: we've experimented ourselves with digital identity platforms, learning from those what their strengths are and what their limitations are. So I hope that's helpful.

CM       Thank you. Following on from that, one of the questions there is: 'it's been claimed that regulation risks constricting innovation; is that something that Ofcom is conscious of?'

SS        Well, we're very conscious of it. It's written into the Communications Act 2003 that, as part of making regulations in the areas we're responsible for, we have to have regard to innovation amongst other criteria, so it's very much written into our DNA to be aware of innovation. But there is a question of how exactly we fulfil that and any equivalent duties we may have. We could just do regulation of the innovation, if you like: pay passive attention to where the innovation is happening and take that into account to make our regulation robust, and we certainly do that and will continue to do it. But broader than that is regulation for innovation: actually making that regulation pro-innovation, without restricting things at the boundaries so that there's only one form of innovation. I think it would be fair to say we've probably paid more attention in the past to the regulation of innovation than to regulation for innovation, and we're very keen, as we adapt to the wider sectors we're regulating, that we're actively enabling that innovation to happen and to continue to happen, against the background that we want the services to be as useful and as safe as possible for everybody.

CM       Thank you. Then a quick question for Iain: 'if you want children to stop lying about their age, you need to stop asking them how old they are; how do you achieve this with technology?'

IC         That's a very good point, and one thing to remember is that this is the fast-moving area Simon was alluding to. I would strongly suggest that companies who are worried about how they stay compliant in this field find themselves a partner in the industry, a bit like you might have an accountant to keep you up to date with the latest tax laws, or even a lawyer, dare I say, to keep you up to date with the latest legal events, because it is moving very fast. There are some attempts at age assurance where we don't need to ask questions, where in fact it can go on in the background, but you come up against a bit of a catch-22 sometimes, which is that if you've got a brand new user on your platform, you have no data about them with which to do any of this analysis. So you have to put them into a safe space until you've figured out their age, and if all they're doing is watching Noddy videos on YouTube, you're never going to be able to figure out from that what their age is. So it is a challenging area, and at the moment we just need to go back to hard data sources to provide a reasonable level of assurance about somebody's age, with perhaps the exception of facial analysis, where we are now getting pretty accurate, to within a couple of years, at estimating somebody's age without needing to ask to see a passport.
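A hedged sketch of how those layers might combine follows, purely illustrative of the reasoning above rather than any provider's actual method; the two-year margin mirrors Iain's 'within a couple of years' figure for facial analysis, and the thresholds are assumptions.

from typing import Optional

def assure_age(behavioural_estimate: Optional[int],
               facial_estimate: Optional[int],
               document_age: Optional[int],
               threshold: int = 18,
               margin: int = 2) -> str:
    # Background signals are soft, so require comfortable headroom
    # (an illustrative choice); a brand-new user has none at all,
    # which is the catch-22 described in the talk.
    if behavioural_estimate is not None and behavioural_estimate >= threshold + margin:
        return "pass: background signals"
    # Facial analysis is accurate to within a couple of years, so only
    # decide when the estimate clears the threshold by that margin.
    if facial_estimate is not None:
        if facial_estimate - margin >= threshold:
            return "pass: facial estimation"
        if facial_estimate + margin < threshold:
            return "fail: facial estimation"
    # Otherwise fall back to a hard data source, such as a document check.
    if document_age is not None:
        return "pass: document check" if document_age >= threshold else "fail: document check"
    return "undetermined: keep user in a safe space pending verification"

print(assure_age(None, 25, None))  # clear pass via facial estimation
print(assure_age(None, 18, None))  # inside the error band: undetermined
print(assure_age(None, None, 17))  # hard data source fails the threshold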

CM       Thank you. We're coming to the end of the session, so I'd just like to ask all of our panellists: what's your top practical tip, or tips, that you'd give companies, and the legal departments of companies, who want to start preparing now for regulatory change? Starting with Iain.

IC         So I always think it's better to be ahead of the regulator, sorry Simon, rather than be following them, and so we're doing our best, through setting standards with BSI PAS 1296, to establish a way forward which will mean that if you're following some of those standards and best practices, you're unlikely to be bothered by regulators. So: work with a partner in the industry.

CM       Thank you, Lorna?

LW       Sorry, I was on mute there. I would say: be aware of what's going on on your service. If this is about risk assessment, you need some idea of where the risks lie, and the starting point for that is to find out what's going on. So yeah, find out what's going on.

CM       Thank you, Vicky?

VN       I think I'd say what I said before, which is: ask your product design team how regulatory compliance can help them build better, more exciting products.

CM       Thank you, and finally Simon, the last word on top tips. 

SS        Well, three top tips, if I may. First of all, read our guidance and what we have to say on these things; there's plenty of that today to chew on. Secondly, please come and talk to us: talk to us about what's unclear for you, talk to us about what you're trying to achieve. And thirdly, and I think actually most important, think about the people you're serving and the businesses you're serving, and think about keeping them safe and doing your utmost to ensure that, because at the end of the day that's what we all want to achieve.

CM       Thank you very much. So I'd like to thank all of the panellists today for your contributions. I certainly found it very interesting, and I hope you have too, watching wherever you are in the world.

We're obviously moving into a world where companies are going to make better use of technology to protect their users, and we can see from the things that our panellists have said that safety tech can empower users, help people make the right decisions, and enable companies to deliver on their terms and conditions and commitments to their users. So online safety has many benefits.

Thank you everybody, and thank you for watching and for your insightful and thought-provoking questions. I'd encourage everyone now to join the closing plenary session, 'Together Towards a Safer Future'. There are some fantastic speakers, including Susanna Storey, who is the Director General at the Department for Digital, Culture, Media and Sport; Michèle Coninsx, who is the Executive Director of the UN Counter-Terrorism Executive Directorate; and Professor Mary Aiken, who is a cyberpsychologist. I don't think I've ever been so jealous of somebody's job title in my life; I would like to be a cyberpsychologist. So thank you for joining us, and thank you again to the speakers, who deserve a round of applause, albeit remotely. Please do join the closing plenary session, have a good evening, and stay safe online. Thank you very much.
