Safety Tech at the UN

An international panel of experts put safety tech firmly on the agenda at the UN Internet Governance Forum on 5 November 2020, in an Open Forum dedicated to exploring the international potential of this exciting new tech sector.

Across the world a new wave of companies are developing a huge variety of innovative products and services that help businesses better protect their online users from harm – safety tech.

Online safety is a global issue, and so the safety tech sector also transcends national boundaries. In the UK, for example, there are more than 70 safety tech companies, more than 50% of which have an active export presence. Across the globe, the sector is already protecting the online experiences of billions of users, and recent analysis reveals that the UK sector has been growing rapidly at 35% per year. The sector remains relatively young, however, and public policymakers are only now becoming aware of the strategic potential of a sector that delivers economic growth and explores cutting-edge technologies, while making tangible improvements to people’s lives.

The UN Internet Governance Forum, a global multistakeholder platform that supports discussion of public policy issues relating to the internet, therefore provided the perfect venue to explore the opportunities and challenges of international safety tech sector growth – and the role that international collaboration could play in growing the sector further.

Online Safety Technology: Towards a Global Market – the Open Forum session

Digital Envoy for the UK Kevin Cunnington led a panel of international experts from the UK, US, Israel, Ireland and Switzerland in discussing how safety tech can tackle a spectrum of harms. The panel heard how the regulatory framework proposed in the UK’s Online Harms White Paper had made space for a serious dialogue on online safety, and created an ecosystem for the safety tech sector to thrive. The panel also invited a global dialogue on standardised performance indicators and stretch goals for technology, and encouraged international information sharing to benchmark high regulatory expectations against the technology available now and in the future.

  • “Cyber security focuses on protecting data and information from cyber attacks – safety tech focuses on protecting people from the psychological risks, harms, and criminal dangers online – everything from mis- or disinformation, to online abuse or harassment” – Professor Mary Aiken, Cyber Psychologist
  • “It’s great for us to see the emergence of a market of independent safety technology providers – and equally of platforms that are prepared to make available their technology to other platforms to help to raise the potential of the industry as a whole… We need to play our part more actively, along with government and industry counterparts, in actually enabling and encouraging innovation in this sector.” – Professor Simon Saunders, Ofcom
  • “The conversations about online safety are characterized by those who want the world to be better, and those who are telling us why it’s impractical, too difficult, or too expensive to actually achieve that. I think that technology can bridge that gap.” – Ian Stevenson, Chair, OSTIA
  • “Our motto as a safety tech industry needs to acknowledge the rapid effects of online harms. If I were to suggest one, it would be, ‘We have to do better, and we can only do that together’.” – Roni Gur, L1ght
  • “The UK was the first market where we got attention and traction. There has been a conversation in that market which we missed everywhere else. So I’m testimony to the thought leadership that comes from the UK.” – Deepak Tewari, Private.ly

A recording of the Open Forum is available online, and a transcript of panel presentations and commitments follows below.

Panelists made a number of voluntary commitments to support growth of the UK and International Safety Tech Sector.

  • Digital Envoy Kevin Cunnington committed to ensure that the use of technology to facilitate safer online experiences remains a top five priority for the European Digital Envoys.
  • Chair of UK Online Safety Tech Industry Association Ian Stevenson committed to make industry a constructive partner in discussion and debates internationally, sharing their expertise.
  • UK communications regulator Ofcom committed to listen to and join the dialogue on safety technology. Simon Saunders invited companies and organisations to demonstrate their technology and the difference it has made to people’s lives, to inform upcoming regulatory developments in the UK to tackle online harms.
  • Professor Mary Aiken committed as a cyber behavioral scientist to creating a better and more secure cyberspace. 
  • Deepak Tewari committed to ensuring that Private.ly measures and demonstrates the positive impact that use of safety technology has on the well-being of children.
  • Roni Gur committed that L1ght will support the growth of an open community around safety tech, contributing technologies, resources and ideas where appropriate.

DCMS will work with the UK Digital Envoy to convene further discussions with national and international stakeholders, to explore pathways to growth of the international safety tech sector.

————-

TRANSCRIPT – IGF 2020 Open Forum #24

Online Safety Technology: Towards a Global Market?

5 November 2020

SPEAKERS

Chair

  • Kevin Cunnington, Digital Envoy for the UK 

Panelists

  • Professor Mary Aiken, CyberPsychologist and INTERPOL advisor (Ireland)
  • Professor Simon Saunders, Director of Emerging and Online Technology, Ofcom (UK)
  • Ian Stevenson, Chair of UK Online Safety Tech Industry Association (UK)
  • Roni Gur, VP Marketing, L1ght (US and Israel)
  • Deepak Tewari, CEO, Private.ly (Switzerland)

AGENDA

  1. Welcome and introductions 
  2. Technology for online safety – a global market (Mary Aiken)
  3. The role of safety tech in online regulation (Simon Saunders, Ofcom)
  4. Opportunities and barriers to international safety tech growth – the view from business 
  5. Plenary discussion – how could cross-international collaboration help grow the international market? 
  6. Conclusion and next steps

1. Welcome and introductions

KEVIN CUNNINGTON Hello everyone, and welcome to the online safety panel. I’m Kevin Cunnington. I’m your chair for this panel, and if I may I’ll spend a couple of minutes just describing why I’m  here today, and why I’m passionate about this subject. 

I am by background a technologist. I spent many years running the UK’s Government Digital Service, and acting as the head of profession for the 17,000 digital, data and technology professionals that we employ here in the UK civil service. I’ve recently taken a new role as Digital Envoy for the UK, and in that role I meet on a quarterly basis with other European Envoys, and we talk about key areas for collaboration.

These areas for collaboration are an interoperable digital identity; an ethical framework for AI; and a number of initiatives around better data sharing and protection of data. We all work on how to develop more digital capability within our own civil services – and finally, saving the best for last, we work on making the internet safer.

You’ll hear from our other panelists that we’ll all make commitments to take this work forward. My commitment to this work is to ensure it remains a top five priority for the European Digital Envoys.

If I may, I’ll give you just a little bit of background on what we might want to talk about. I think most of you will know, people are generally very positive about the internet. Eighty percent of us say it’s made lives better for people like us. But there is of course a dark side to the internet that we’ve all experienced – whether that’s terrorism, cyberbullying, trolling or the impact on children.

Technology can help us in this space. A good example from here in the UK is the Home Office, which has developed a machine learning algorithm that helps them detect terrorist videos on the internet, and then take them down. 

Last two points. Safety tech is very much on the rise – UK sector growth is up 35% year on year, the sector employs thousands of people, and it contributes billions of pounds to the economy. Finally, we are seeing that safety tech is global. At least fifty percent of UK safety tech companies have an export presence. The three objectives for this session today are:

  • We want to share the work that we’ve done to define and measure the safety tech sector in the UK and abroad;
  • We want to explore the extent to which the challenges and opportunities in the UK reflect those across the world; and finally 
  • Discuss how we might take this forward internationally by collaborating together 

2. Technology for online safety – a global market

PROFESSOR MARY AIKEN Thank you Kevin. So today, we’re going to talk about safety tech. We’re all familiar with healthtech and fintech, and now we have safety tech.  We know that digital advances will drive our economies, we know that they will enrich our societies – but to harness the internet’s advantages, we also must confront the online threats and harms it can propagate.  We need to ensure the safety and security of those online. We need to factor in those who are vulnerable, in order to maintain thriving democracy and society where freedom of expression is protected. 

So, the online safety technology – or ‘safety tech’ – sector is defined as comprising any organization involved in developing technology or solutions to facilitate safer online experiences, and to protect users from harmful content, contact, or conduct. The online safety tech sector consists of companies whose products or services help online platforms to deliver safer outcomes for users, for example through moderation or filtering. So, you may ask, what’s the difference between cyber safety and cyber security? Well, fundamentally, cyber security focuses on protecting your information, your data – and safety tech focuses on protecting people.

The point is, your data is never going to suffer from low self-esteem, your data is never going to feel the need for revenge. This is about factoring the human into the cyber security and cyber safety equation.  We know that it is critical that our networks and systems are robust, resilient and secure – however it is equally important that people are psychologically robust, resilient, secure and safe. At no time is this more important than now, in the middle of the global pandemic. So, cyber security focuses on protecting data and information from cyber attacks – safety tech focuses on protecting people from the psychological risks, harms, and criminal dangers online – everything from mis- or disinformation, to online abuse or harassment.

In 2019, I was an expert advisor to the safety tech sector report. We conducted a review of online safety technologies, and the good news is that we found a thriving safety tech sector, which complements the existing cyber security industry. It is valued in the UK at more than £500 million and importantly is growing by 35 percent year on year.  An industry sponsor, Paladin Capital, is currently funding our research to investigate the US and the rest of the world in terms of their online safety technologies, and our results will be published later this year. The good news is so far, there’s evidence of an emerging and thriving sector in the US. 

So, to wrap up, online safety is a multi-faceted problem with no single solution. It’s a global issue with many countries concerned about its potential for harmful impact on security, health and importantly on social cohesion. Tech solutions are also global in nature. We want to work together to create a safer and more secure cyberspace for all.

Safety tech importantly supports five UN Sustainable Development Goals –

  • Goal 3: Good Health and Well-being;
  • Goal 8: Decent Work and Economic Growth;
  • Goal 9: Industry, Innovation and Infrastructure;
  • Goal 12: Responsible Consumption and Production; and
  • Goal 16: Peace, Justice and Strong Institutions.

Safety tech can deliver an internet for human resilience and for solidarity.

3. The role of safety tech in online regulation

SIMON SAUNDERS (OFCOM) Thanks very much indeed, Kevin, and hello everybody. As Kevin says, I look after emerging and online technology at OFCOM – and if you don’t know OFCOM, we’re the UK’s independent regulator of the communications sectors. 

The role my team have at OFCOM is to keep colleagues informed of technology developments across all of our sectors. Traditionally that’s meant the telecoms, broadcast and postal sectors, but in the context of today’s discussion, OFCOM has an increasing role in keeping users of internet platforms safe online, while ensuring that those services can continue to improve in the usefulness that Kevin indicated at the beginning.

That role comes from recently acquired duties which have appointed us as the regulator of some video sharing platforms that are based in the UK – but also from the fact that government has announced that it’s minded to appoint OFCOM as the regulator for online harms for all of the relevant services consumed by people in the UK. We’re figuring out what that means for us in terms of the relevant technologies, and what we need to know and to enable regarding those technologies.

In that context, safety technology – ‘safety tech’ – is especially relevant. We need to understand how platforms accept, moderate and distribute content, how they identify which content has a significant risk of being harmful to users, and how they will deal with that content when it is identified. Technology for automated content classification and matching to known harmful content – and how that technology is distributed, standardized and kept appropriate and current – is critical to us.

Similarly, there’s an opportunity for systems that enable appropriate content to only reach the appropriate users – so that includes things like age verification and assurance systems, parental controls and potentially wider issues of our digital identity. In hopefully only extreme cases it’s also relevant to us to understand how those platforms distributing content of various forms interact with the underlying internet infrastructure,  and how content which hasn’t been responsibly handled by platforms can be addressed in a way that minimizes any harms that could arise from it.

So, understanding the effectiveness of that tech, and how it might evolve in the future, is really important for us in a number of ways. For example, in setting an expectation on platforms as to how they’re going to exercise a duty of care to keep their users safe. As a regulator we would need to find ways to measure effectiveness, and to give the platforms a clear and informed guide to what level of performance is appropriate at a given time. That’s  going to be a tricky thing to do, given the wide range of platforms, the wide range of content, and the wide range of user expectations – as well as the actual capabilities of the technology.

However it’s also important for us to understand how easy it is for the platforms to access and implement those technologies. So it’s no good for us to know that there’s some especially capable platform that has some amazingly effective content classification technique, if that technique isn’t available to any other platforms, or if the implementation and the running of that technique would require resources which it’s not reasonable to expect – particularly of smaller platforms.

So, as well as understanding the technical capabilities, it’s great for us to see the emergence of a market of independent safety technology providers – and equally of platforms that are prepared to make available their technology to other platforms to help to raise the potential of the industry as a whole. The market has a kind of ‘safety tech as a service’ potential, that it would be great to see emerging over time.

Safety tech as a whole, though, is a very new category. We’re one of the first regulators in the world to be creating a specific regime around regulating the online safety of platforms. So we may not be able as a regulator just to watch how this evolves, and interpret it for our purposes. We need to play our part more actively, along with government and industry counterparts, in actually enabling and encouraging innovation in this sector.

We’re still considering what this might mean in practice. It could involve actions like regulatory sandboxes to try things out, setting research challenges, and even building trials, test beds and proof of concept projects to really get under the skin of this technology. We’ll probably need to find ways of identifying metrics of success, and how to benchmark systems against those metrics. We may need to play a role in making available sample data sets, so that the platforms can improve over time, and perhaps defining APIs for consistent interoperability of those different systems. This is a whole host of activities which are not necessarily typical for a regulator to engage in, but which we think might be necessary in this world.

Likewise, we need to give some thought to technical standards. There’s a balance to be struck there in terms of using technical standards to provide clarity and a wide market for the providers of safety tech, while ensuring that there’s wide applicability of best-of-breed approaches and that the setting of standards doesn’t in some way stifle the pace of innovation to the benefit of users.

Last but not least, what we really need is an international dialogue on these issues. There’s a thriving safety tech sector in the UK, which is very helpful for us to have locally, but overall we want the best outcomes for users, wherever that technology comes from. So this dialogue, and what we might do as a follow-up to it, is really a great opportunity to get the ball rolling for ongoing international discussions and I really look forward to those.

4. Opportunities and barriers to international safety tech growth – the view from business 

IAN STEVENSON, CHAIR, OSTIA Hello, I’m Ian. I’m Chair of the UK Online Safety Tech Industry Association, which I helped to form after my own company Cyan Forensics started moving into the online safety world, and I felt there was a gap: a need to bring the online safety industry together – not just to talk amongst themselves, but also to communicate with policy makers and customers, and to contribute to the public debate.

We really want to offer a voice of hope. Because too often, I think, the conversations about online safety are characterized by those who want the world to be better, and those who are telling us why it’s impractical, too difficult, or too expensive to actually achieve that. I think that technology can bridge that gap. There are members of the Safety Tech Association and other companies, some of whom are here today, who already produce products and technologies that can solve individual parts of the problem in really compelling ways, and help move things forward.

One of my observations coming into this sector is that everybody in this industry is motivated by creating safer experiences online, by keeping people safe, by creating that psychological safety. All of our businesses are driven by purpose as well as by profit. I think online safety is developing really rapidly, and often businesses – especially small companies such as startups and spin-outs – are the sources of innovation that can drive that sort of growth.

But for innovation to be effective and well-directed, we need to be part of a much wider community that determines what we as a society globally want technology to do, and what regulators are going to ask it to do and legislation is going to force to happen. So we need to work together to set standards, develop metrics for success and create environments for training and testing technology with real world data. 

I’m sort of delighted that in terms of a number of my points, Simon Saunders from OFCOM has stolen a march on me, which I think shows the degree of alignment we’re starting to arrive at in the UK. I’d love to see that emerging globally. I think it’s really important as well that as a community, we’re collaborating with those who safeguard other rights. So whilst I think freedom of speech and privacy are often used as excuses for inaction, there are genuine conflicts between online safety, freedom of speech and privacy, and we need to have that debate – because businesses and, perhaps just as importantly, investors can only invest in creating solutions where it’s clear that there will be a market for them.

So, clarity on the direction of policy globally, because this is an international market by definition; clarity on what regulators will be looking for; and support from governments and users of safety tech will all be vital to getting the investment into the sector that we need for it to grow. Policy and regulation also need to be innovation-friendly. I’m going to repeat some of what Simon said again here, but this can be accomplished at least in part by focusing on the outcomes of the application of safety tech, rather than on what the technology is – because that leaves scope for new inventions to be created. We also need to recognize that what a regulator should be asking platforms to do today may well be very different from what it should be asking platforms to do in two years’ time, because the world will move on and new technology will become available.

So we see this as a very dynamic world, if it’s going to achieve its true potential – one where regulators set baseline and stretch goals today, with an expectation that next year the stretch goals will have become the baseline, and there are new stretch goals for what people should be achieving. That also means balancing simple prescriptions, such as standards and APIs, which can be very compelling, against the need to enable and support innovation.

So OSTIA, the UK Online Safety Tech Industry Association, exists to facilitate conversation and collaboration.  We’ve started focusing activities in the UK simply because you have to start somewhere, but this industry is international by its very nature, and we want to be part of the global conversation and community. I think events like today are a fantastic opportunity to get these conversations and collaborations started.

RONI GUR, L1GHT Hi everyone, my name is Roni Gur, I’m VP Marketing at L1ght. We build AI technology for commercial platforms such as gaming, messaging and web hosting providers, as well as law enforcement agencies. This AI technology is able to detect and eradicate at scale online toxicity such as hate speech, self-harm, suicidal tendencies, bullying, shaming and child sexual abuse. We’re US-based, with an R&D centre in Tel Aviv, and I’m really happy to be part of this panel. I’d like to reflect on a few challenges and opportunities. I think that there’s similarity between all of the panelists, and the voice that’s being heard – I’ll get to that in a second.

Let’s be very honest, the safety tech industry is very very young, and big tech has progressed very rapidly, leaving us maybe just a little bit behind. And this is no surprise. Social networks and big tech use narratives like, “move fast and break things”. If that’s the case, our motto as a safety tech industry needs to acknowledge the rapid effects of online harms. If I were to suggest one, it would be, “We have to do better, and we can only do that together”.  That sounds very very cheesy – but I’d like to try to explain why I think it’s also very true.

Safety tech is new. It’s young. It’s up against vast amounts of online danger. But we also need to acknowledge that government and the private sector can definitely work hand in hand, and complement each other. Because there are things governments can or can’t do, and things private industry can or can’t do, and vice versa. Together we can get creative about how this ecosystem should grow, and what governments can do to make it grow.

So I thought about three main points. One is to understand the ecosystem – and you heard this before from other panelists. The safety tech ecosystem has various players – it has users, adults, children, me, you, anyone who’s online, regulators, safety tech providers, and the platforms themselves where content is being shared and people are getting abused. It’s very important to make sure that while platforms are part of the conversation, we also differentiate between them. Because when we say platforms, we tend to think about Facebook and Twitter, and maybe even point fingers.

What I suggest is to make sure that we understand that there are bigger platforms, but there are also smaller and medium-sized platforms. They both host online harms, they both want to eradicate them, and they both need a cost-efficient way to do it. If you are a smaller platform you need a cost-efficient plug-in, and you need it to be helpful for you. If you are a bigger platform like Facebook or Twitter, you can develop whatever you want, but then only you have it, or only you have access to it – and usually platforms are just very much overwhelmed by what they find. So I encourage governments to understand the need to move together, by mapping the safety tech ecosystem, getting familiar with technologies that are out there aimed at creating safer online experiences, and understanding that there are different players with different needs.

I also encourage two other points. The first one is that safety tech needs to be incentivized – whether it’s funding, academic research, subsidizing safety tech development, government or industry grants, sandboxes – anything you can do to make it worthwhile to be in the industry that is up against major forces that create online harms.

Which leads me to my last point. It can’t be so easy to create online harms, but so difficult to eradicate them. Regulation is really important. We all know this. But we all know that it also takes a long time.  So I suggest we need to certify, if not legalize, codes of conduct or permits for certain types of work. 

We’ve been in business for two and a half years, we’ve raised 15 million dollars, we have partners that are law enforcement agencies – and still we have trouble with certain content that can be uploaded to certain servers. There’s still the fact that child sexual abuse material is so hard to tackle and eradicate because of its nature, and because of various rules in various countries. So if regulation takes a long time, and we need to start walking that path, we also need short and medium term goals like certifying.

DEEPAK TEWARI Hi everyone, my name is Deepak and I run a company called Privately in Switzerland. Our mission is to keep kids safe online, so all our technology is focused on making kids more resilient, giving them the means to be protected, and to be part of their own safety. We do this through AI that works on their own phones or within their apps. So the angle that I’m going to take focuses on our experience from an internationalization point of view.

Even though we are a Swiss company, the UK was the first market where we got attention and traction. There has been a conversation in that market which we missed everywhere else.  So I’m testimony to the thought leadership that comes from the UK. And through having developed our first customers in the UK, we are able to show what the technology is capable of to other people in the rest of the world. 

Regulation is another big part. Talking to other regulators in Italy, Switzerland or in other places, I think there’s still a very big difference in perception of what constitutes online harm,  whether it is real, and where the liability lies. Is it with the device manufacturer,  the telecom operator or the platform operator?  Things fall between the cracks. So for the growth of this sector, we need to understand where the established duty of care lies, for young people but also for older people.

The other thing is perspective. The name of the company that we run is Privately, and something very peculiar about us is that we have focused on trying to bridge this gap between privacy and safety. Some people say that to provide some safety measures, you might need to violate someone’s privacy. I’m here to tell you that we’ve developed tech that sits on the phone, and can be super private. And we are not the only ones. So what regulators and others must know is that harms can be identified without violating anyone’s privacy. Some platforms are saying that it is not possible. So safety tech companies must bridge the gap between what we are producing, and what is being taken on by bigger platforms.

CONCLUSIONS AND NEXT STEPS

KEVIN CUNNINGTON I’d like to ask the panelists to talk about their personal commitment to this subject going forward, and how we turn this conversation into some actions that help us achieve all the goals that we’ve talked through in the panel session today.

IAN STEVENSON Through the Online Safety Tech Industry Association, we are committed to being a part of this conversation. I’d encourage everyone, whether you’re in the UK or not, and whether you’re in the public sector or the private sector, to look us up, engage with us, and look at some of the events we’re running. Our commitment is to try and make industry collectively a constructive partner in these conversations globally, so that if we were to reconvene at a meeting like this in five years’ time, we would be looking back and talking about successful deployments of practical, innovative technology to address some of the key harms. I’m sure we’ll still be talking about the missing pieces of the jigsaw puzzle as well, but that’s the point I’d like to get us to. OSTIA really wants to be a part of that conversation, and we’re open to engaging with anyone and everybody who wants to join it.

MARY AIKEN We’ve been sleep-walking our way into an age of technology. We adopt each emerging technology with the collective wisdom of lemmings leaping off a cliff sometimes – and just because it’s new doesn’t mean it’s good. For me, technology is only going to mean progress when we can mitigate its harmful effects. Ian’s point is a really good one – to look towards creating an international standard for online harm. We’re doing some work at UEL at the moment on creating, with Ofcom, taxonomies of harm, and we’re also considering how those taxonomies apply in a developmental context. So I’m absolutely committed as a cyber behavioral scientist to creating a better and more secure cyberspace. I just want to wrap up by saying we should have three aims. One is an aim of privacy. The second is an aim of the vitality of the tech industry. And the third aim is one for collective safety and security. And the point is about balance. None of these aims should have primacy over the others – and that’s the roadmap to best practice going forward.

DEEPAK TEWARI I’m approaching this from an engineer’s and a technologist’s point of view. Whatever we are developing, we want to be able to measure the impact. So for the tech that we are developing today, we can tell you that about 60,000 kids are using it, and it has a certain measure of reduction of online harms. So for all the efforts that we are making now, my goal would be to see how they are creating an impact. Are they able to reduce online harm as we know it? Are we able to identify situations, mitigate them, make kids more resilient? Technology for technology’s sake is not great. So my personal commitment would be that for everything that we are doing, we will be able to demonstrate the impact that each of these technologies has on the well-being of kids. And by definition that would mean that you would be able to replicate that impact elsewhere.

RONI GUR I would love to piggyback off that, and share that every board of directors meeting that we hold starts with our number one goal in the company – which is how many lives did we save. Then we can talk about profit and customers and all of the rest of the things. I think that’s unique for a private company. We want to create an open community – our commitment is that we will show up, we will open our books and we will commit to working together. We will put in everything we can – which is mainly technological, but could be resources or ideas, because we have to do it together. When one thrives, everyone will.

SIMON SAUNDERS In terms of commitment, the key thing I can offer at this point is dialogue. We’re hungry to hear more from the providers of this technology – come and show us the technology that you’re working on; where you’ve made a difference to people’s lives; how you measure those outcomes, and what is an appropriate way of measuring; your views on what the state of the art looks like today, what could change over time, and what we could do to shift that gradient upwards. It’s important to us that dialogue is truly international. The platforms and the sources of content that we’re dealing with are international, and I’m quite certain that the innovators for that technology are all around the world as well. So while we’ll continue to engage very deeply with our homegrown providers, facilitated by Ian and the great work of our colleagues in government, we’d also like to hear from the rest of the world as well. And I can certainly commit to listening to those.
