In January 2021, Marc-Antoine Durand, COO of Yubo, contacted me. Founded in 2016 in Paris, the company created a social video live-streaming app with 60 million users worldwide. In February, I joined the company’s Trust and Safety effort to help make Yubo a safe place for a community of young users meeting and socializing online. Given the sensitivity of the complex issues covered by any Trust and Safety department, I won’t be able to illustrate this client work with a detailed case study running from a real user problem to the implemented solution through the whole double-diamond design process, as I usually do; I will mention these issues more generically though. Instead, in this post I will emphasize the Product Design specificities I learned throughout this year-long mission, through three questions.
Nonetheless, to give a better picture of my collaboration with Yubo, I have listed my main achievements at the end of this post.
Role: Freelance Senior Product Designer with an extended product role: product thinking, product management, user experience design, prototyping, user testing, user interface design, development management, and quality assurance.
Period: March 2021 - March 2022
Even if the “Trust and Safety” function has not always gone by that name among organizations, Trust and Safety dates back to the early days of the internet, when the first read-only websites started to boom in the mid-to-late 1990s, more than 20 years ago, shaped by the overlapping influences of computer security, early legal concerns like online copyright, and content moderation for some of the first popular Silicon Valley websites like Yahoo (web portal, 1994), Craigslist (classified advertisements, 1995), or Google (search engine, 1998).
Depending on the company and the nature of the product, a Trust and Safety department can focus on a wide variety of issues: spam and fraud, child sexual abuse material, hate speech and harassment, incitement and terrorism, or more recent online issues like suicide, self-harm and mental health, or misinformation and disinformation, most of the time combining several of them, all the more as products become more social. As a live-streaming application used by young users, including a high proportion of minors, Yubo has to focus its Trust and Safety effort especially on identity verification and on live content and behaviour detection.
We can sum up the Trust and Safety activity as the effort an organization deploys to study how its products are likely to be abused, and to enforce principles and policies that prevent online harm, thereby building trust with its users.
From the solo system administrator who had to manually remove controversial content from a web page in the late 1990s, to today’s Trust and Safety departments employing tens of thousands of people worldwide and using the most advanced technologies at the biggest tech companies like Google or Meta, Trust and Safety has become increasingly professionalized over the years, with the blooming of a rich ecosystem: industry experts from different fields, professional associations, and more recently the creation in 2021 of a dedicated academic research journal by Stanford University.
As billions of people interact online, some of them for several hours per day, with more and more advanced interaction abilities (think about what’s coming next with social virtual reality), some major events of the past few years have already raised strong ethical concerns and shed light on the enormous responsibilities tech companies and their Trust and Safety departments are facing. Think about the turbulent 2020 US election and the suspension of the main social accounts of the President of the United States; the 2021 Facebook Papers, the internal document leak reporting that the company was fully aware of Instagram’s negative impact on teenage users and of the contribution of Facebook activity to violence in developing countries; or, more recently, the spread of violence and misinformation on platforms like TikTok following the Russian invasion of Ukraine.
The first question a Product Designer has to clarify is: who are the users? In Trust and Safety, the answer is notably diverse. We can first bring out two macro types of users.
Main product end-users: all the users we want to protect from harm so they can enjoy a trusted community, each with their own singularities: country, culture, language, ethnicity, etc. For those users, a Product Designer will work on user-facing safety features.
Internal users: all the internal users involved in the Trust and Safety effort: Safety Directors, Managers, Analysts, Specialists, etc. For those users, a Product Designer will work on internal tools and features.
To create a safe place online and build trust within a community, the option that is best for the users should always win. Product Designers, who work on the ground and use plenty of methodologies to understand users, are among the best user advocates in a tech company. The soft skills required in Product Design are also extremely valuable in Trust and Safety: strong empathy, attention to detail, problem-solving skills, and the capacity to engage in difficult conversations. Design is also a cross-functional activity by nature: a designer has to satisfy a goal under multiple constraints and may take into account functional, aesthetic, socio-political, economic, or legal considerations while keeping the users in mind. The same goes for Trust and Safety, where multiple teams are involved: customer support, policy and community standards, legal, engineering, etc.
Companies and products are always exploring new areas and are thus constantly confronted with new local legal and cultural norms. Problems unfortunately often surface once the product or feature has already been launched, bringing harmful consequences and public backlash that require emergency adaptations, or simply a complete rollback and cancellation of an entire project. The opposite approach is to design the product with trust and safety in mind, borrowing the old concept of “safety by design” from workplace health surveillance, where occupational hazards are minimized early in the industrial design process. All impactful tech companies have a lot to gain by implementing this “trust and safety by design” approach directly in their product development process.
In Trust and Safety, little mistakes have big consequences. The decisions you make matter so much to the users and the brand that you can’t sacrifice efficacy for efficiency, as is common especially in startup tech companies facing limited engineering resources. Product Design can improve Trust and Safety operational productivity by getting into the details of processes, user experiences, and user interfaces, searching for optimizations. To give a precise example, we noticed at Yubo from our operational metrics that for one type of issue, “A”, sent for human review, 98% of the enforcement actions taken were of type “Z”. Instead of letting safety specialists choose among the N enforcement action types (requiring two clicks in the interface), we made “Z” the primary action, directly available in one click. So 98% of the “A” issues are now reviewed in one click and the remaining 2% in two. This small saving per review, multiplied by tens or hundreds of thousands of decisions per day and applied to all the other types of issues, generated strong operational gains, resulting in a positive impact for the community.
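The arithmetic behind this kind of optimization is simple expected-value reasoning. Here is a toy Python sketch of it; the volume figure and helper function are illustrative assumptions, not Yubo metrics:

```python
# Toy estimate of the gain from promoting the dominant enforcement
# action to a one-click primary action. All numbers are illustrative
# assumptions, not real operational metrics.

def clicks_per_review(share_primary: float, clicks_primary: int, clicks_other: int) -> float:
    """Expected number of clicks for one review, given the share of
    reviews resolved by the primary action."""
    return share_primary * clicks_primary + (1 - share_primary) * clicks_other

before = clicks_per_review(0.98, 2, 2)  # every action used to take two clicks
after = clicks_per_review(0.98, 1, 2)   # primary action "Z" now takes one click

reviews_per_day = 100_000               # hypothetical daily review volume
clicks_saved = (before - after) * reviews_per_day
print(f"before={before:.2f} after={after:.2f} clicks saved/day={clicks_saved:.0f}")
```

With these assumed numbers, the expected cost drops from 2.00 to 1.02 clicks per review, i.e. roughly one click saved per review at scale.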
Detection and enforcement technologies are extremely powerful when combined with human review, thanks to the learning loop they can create. Models based on artificial intelligence are built and trained to proactively detect violating content and behaviour, removing content or restricting accounts before the community is affected on a larger scale. These technologies look for issues in pictures or learn to understand text in multiple languages. Yubo is a live video-streaming application, so live detection is critical for the company: if a user violates the community standards while live, an audience is affected immediately, whereas an inappropriate picture posted on Facebook or Instagram takes longer to spread and reach an audience. With high confidence scores, enforcement actions can be applied automatically and preventively. With lower scores and more complex issues involving many contextual factors, human investigation and review are needed. A Product Designer can help make this human-machine collaboration work smoothly. First, artificial intelligence outputs can be turned into comprehensible information for non-technical moderators. It is also worth approaching the question of the moderators’ working environment with designers, who can come up with innovative mechanisms to build into the tools to limit the negative impact of regular exposure to disturbing graphic content.
This year-long mission with Yubo has been a great opportunity to discover from the inside a topic I have grown increasingly interested in over recent years. The hard work and challenging complexity of both the product design and the product organisation closely match my professional interests, and I hope to soon help other companies for which trust and safety are high stakes.
If you are working in Trust and Safety and are interested in my work, please contact me.