Why you need to add a trust and safety officer to the leadership team

Companies need one person in charge of creating a consistent user experience that is strong on safety and trust.


As trust becomes less of a nebulous idea and more of a competitive advantage, it might be time to add a new role to the executive team. Trust and safety officers can keep an eye on the entire user experience and make sure it is comfortable and not creepy. These leaders can build community policies and guide moderation teams to create safe, online spaces.

Tiffany Xingyu Wang, chief strategy and marketing officer at Spectrum Labs, said trust and safety officers need to be part of the executive team.

“I passionately believe that the trust and safety officer needs to be part of the C-suite, as user safety issues can have a terrible impact on a brand and its reputation, and can even be business-ending,” she said.

Wang also recommends that the person in this role have a company-wide goal related to user safety and work with executives in product, engineering, marketing, legal, human resources, privacy and security to achieve it.

"This is not the status quo at present, but it's where the industry is quickly moving," she said.

Akif Khan, a VP analyst at Gartner, said the role of a trust and safety officer varies from company to company, depending on the scope of trust and safety at each organization.


“In some instances, teams responsible for trust and safety have largely focused on protecting the integrity of user accounts and preventing account takeover, preventing manipulation of marketplaces by buyers and sellers, and also preventing payment fraud, and so have been aligned under the head of digital commerce or head of digital channels,” he said. “In others, where the trust and safety focus is broader and extends into creating safe spaces for user generated content, I’ve seen alignment under a head of customer experience.”

Khan said this role will become increasingly important as more virtual worlds come online and companies have the opportunity to differentiate themselves via their approach to trust and safety.

"To do that will require having specific leadership in place from the outset; it can't happen as an add-on once the virtual worlds gain momentum and behaviors become entrenched," he said.

Khan said CEOs need someone, such as a trust and safety officer, responsible for the customer experience to ensure that:

  • The brand’s digital spaces align with customer expectations of integrity (no fake reviews)
  • Digital spaces are inclusive and designed to be accessible to all
  • Users feel safe in an environment free of abusive comments and trolling, with safeguards for their personal data

“If that isn’t all aligned under one person’s responsibility, things risk falling between the cracks as many of these areas don’t have a natural home otherwise within the organization,” Khan said.

Because this trust and safety role is so new, there’s not a lot of formal training for the job. People in these roles need a varied set of skills, Wang said, ranging from product development, engineering and operations to an understanding of legal, compliance and moderation best practices.

When hiring people to work on trust and safety issues, Wang suggests looking for individuals who can see the big trust and safety picture, as well as respond to individual tickets. This means generalists who can handle a wide range of situations, from responding to legal subpoenas to fielding complaints from angry parents concerned that their children are being harmed on the platform.

“Later, when you start implementing more proactive and preventative measures, the trust and safety department may separate into two distinct teams, one that responds to tickets and another that handles proactive measures,” she said.


Assembling the right team and the right tools

Trust and safety leaders need a strong community policy to be effective in their work, according to Wang.

“It determines what’s acceptable, what’s a cause for action and what that action is when it is applied, along with any mechanism for appeal,” she said.

She said every good policy includes these three elements:

  1. Platform policy: This outward-facing policy for users covers community guidelines, rules and enforcement, including specific issue areas such as minor safety or hate speech.
  2. Product policy: This internal policy describes how to develop features and user experiences as well as how trust and safety product features work within the app.
  3. Public policy: This covers trust and safety considerations required by government regulations within a specific country.

Trust and safety officers also need the right team to be successful. Wang recommends that this group include these team members:

  • Moderators: To monitor content, review user posts, and take action on content that violates community guidelines.
  2. Data labelers: To manage training data for AI models and develop tests to identify positive and negative examples of hate speech so that models can learn the difference.
  • Analysts: To produce internal and external reporting of how effective a company’s trust and safety measures are.
  4. A legal resource: To track relevant regulations, handle incoming subpoenas and serve as part of the company’s law enforcement response team.
  • A project manager: To work with product, engineering, communications and legal organizations and make sure everyone is on the same page and completing tasks for success.

Spectrum Labs is hosting the Safety Matters Summit on May 18-19, a virtual event that will feature brand and user safety experts from Mastercard, Riot Games, AB InBev, Twitter, Roblox and TikTok. Speakers will discuss trust in the metaverse era.