Why a safe metaverse is a must and how to build welcoming virtual worlds

If harassment in virtual worlds is anything like what people have to put up with now, no one will want to hang out in these spaces for very long. Tiffany Xingyu Wang, chief strategy and marketing officer at Spectrum Labs, predicts that the harassment and personal attacks that 41% of U.S. internet users have experienced online will get worse in virtual worlds. This includes everything from name-calling and purposeful embarrassment to physical threats and stalking.

“The metaverse is immersive and multisensory, which makes the impact much bigger,” she said. “The lead time to toxicity is much shorter.”

Wang thinks it might be too late for legacy social media platforms to solve the problem of online harassment. She sees the growing number of virtual worlds as a chance for the tech industry to get it right this time. Companies that build safe and responsible communities will have a competitive edge in the metaverse.

“Companies can start with safety by design and use that as a differentiator and attract people that way,” she said. “New platforms will win because they do those things very well.”

In addition to protections against theft and financial scams, metaverse residents need basic personal safety protections.

Avoiding a repeat of Web 2.0’s problems means developing a shared code of conduct that includes consequences for breaking its rules. Spectrum helps companies build AI infrastructure to establish trust and safety on websites in several sectors, including social media, dating, gaming, e-commerce and edtech. Customers use the company’s content moderation tools to identify hate speech, racism, child sexual abuse material and other content that violates terms of service agreements.

Wang said that community policies are more than an insurance policy or an unavoidable cost center.

“It can be the reason people come to your community, which can reduce customer acquisition cost and increase retention,” she said. “There’s a business case for trust and safety.”

How to develop community policies

Any corporation, individual or organization running an immersive community online will have to prioritize writing solid policies and enforcing those rules of behavior, according to Wang. The key is doing both tasks well.

“Often when we see scandals, the problem is that companies have policies but they don’t enforce them,” she said.

The Trust & Safety Professional Association is a non-profit organization for professionals who develop and enforce principles and policies for acceptable behavior and content online.

Wang said that gaming and social companies are pioneers in this space but that the pandemic accelerated interest in the work the Trust & Safety Foundation is doing.

Another key to success is writing a code of conduct that is designed for the group using the platform. For example, policies for an LGBTQ dating site would be very different from those for a gaming site designed for 9- to 13-year-olds.

“The companies that do this well hire a policy team to work with marketing and comms to write a policy that reflects the brand identity,” she said.

Community policies also guide the appropriate consequences for people who break them, Wang said.

“You can’t build enforcement rules without knowing the fundamental code of conduct,” she said.
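
To make that ordering concrete, a platform can express its code of conduct as structured data and then derive enforcement from it. The minimal Python sketch below is purely illustrative, not Spectrum’s product; the category names, severities and actions are invented:

```python
# Hypothetical sketch: a code of conduct expressed as data, with
# enforcement consequences derived from it. Category names, severities
# and actions are invented for illustration.
CODE_OF_CONDUCT = {
    "hate_speech": {"severity": "high", "first_offense": "suspend_7d", "repeat": "ban"},
    "harassment":  {"severity": "high", "first_offense": "suspend_24h", "repeat": "ban"},
    "spam":        {"severity": "low",  "first_offense": "warn", "repeat": "mute_24h"},
}

def consequence(category: str, prior_offenses: int) -> str:
    """Look up the enforcement action the written policy prescribes."""
    rule = CODE_OF_CONDUCT[category]
    return rule["repeat"] if prior_offenses > 0 else rule["first_offense"]
```

Because the consequences are looked up from the code of conduct itself, enforcement cannot drift away from what the published policy says.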

SEE: Qualcomm and Lenovo launch new augmented reality partnership to expand developer ecosystem

Spectrum embeds these policies into moderation rules that artificial intelligence enforces automatically. This approach means human moderators have less exposure to violent and offensive content.
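
In practice, that kind of automation often takes the form of a triage layer in front of human review: content an AI classifier scores as a confident violation is actioned automatically, and only ambiguous cases reach a moderator. Here is a hedged sketch; the thresholds and the classify() interface are assumptions, not Spectrum’s API:

```python
# Hypothetical triage layer. classify(content) is assumed to return a
# dict of per-category scores, e.g. {"hate_speech": 0.97, "spam": 0.02}.
AUTO_ACTION_THRESHOLD = 0.95  # confident violation: act without human review
REVIEW_THRESHOLD = 0.60       # ambiguous: queue for a human moderator

def moderate(content: str, classify) -> str:
    """Route content so moderators only see the ambiguous middle band."""
    scores = classify(content)
    top_category, top_score = max(scores.items(), key=lambda kv: kv[1])
    if top_score >= AUTO_ACTION_THRESHOLD:
        return f"auto_remove:{top_category}"  # a moderator never views it
    if top_score >= REVIEW_THRESHOLD:
        return f"human_review:{top_category}"
    return "allow"
```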

The other vital component in building a safe online community is transparency, Wang said.

“If you take action, if you suspend a user or you take a person off the platform, you have to tie it back to policy,” she said.
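
One way to make that traceability concrete is to store every enforcement action alongside the policy clause that justifies it, so any user-facing notice can cite the rule. A minimal sketch with invented record fields:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class EnforcementRecord:
    """Hypothetical audit record: every action carries its policy basis."""
    user_id: str
    action: str           # e.g. "suspend_7d"
    policy_category: str  # e.g. "hate_speech"
    policy_url: str       # link to the published community policy
    timestamp: datetime

def notice(record: EnforcementRecord) -> str:
    """User-facing explanation that ties the action back to policy."""
    return (f"Your account received '{record.action}' for a "
            f"'{record.policy_category}' violation. "
            f"See the policy this is based on: {record.policy_url}")
```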

Wang recommended a new white paper from Grindr as a good example of how to develop content moderation strategies. Three trust and safety leaders at Grindr wrote the paper “Best practices for gender inclusive content moderation,” which examines various content moderation decisions and the multiple factors to consider when setting policies.

Trust and safety experts Vanity Brown and Lily Galib, and Alice Hunsberger, the senior director of customer experience at Grindr, explain how to design comprehensive policies, review best practices for inclusive moderation and offer resources for moderators and users.

Who sets the rules in virtual worlds?

That need for a shared code of conduct is related to the security problem: Who’s in charge here? As Ahmer Inam, chief AI officer at Pactera Edge, describes it, part of the challenge is that there is no clear enforcing entity for metaverse rules.

“Virtual worlds are completely borderless, so whose laws apply?” he said.

No one wants to extend the authoritarian tendencies of many governments to metaverse worlds. However, if theft, scams and general criminality run rampant in these virtual places, that will also discourage wider adoption.

Inam thinks a partnership between public and private entities should develop a shared set of rules for metaverse worlds.

He sees potential in learning from nuclear power, where safety concerns led to international treaties that govern the technology with trustworthiness as a central organizing principle.

SEE: Metaverse cheat sheet: Everything you need to know (free PDF)

Inam predicts that once regulated industries start to build metaverse experiences, regulations will come faster. He also thinks that ethical AI charters could be expanded to cover the metaverse, but that will require a partnership between industry and government.

“The growth of the metaverse could accelerate not just the tech but an individual bill of rights and privacy because of the potential for harm that exists,” he said. “As a technologist, I’m excited for what’s to come, but as a citizen of our society, I’m a little bit concerned.”

The pandemic showed that internet connectivity is as important as electricity, and it prompted government investment to expand broadband to communities that lack high-speed access. James Arlen, CISO at database-as-a-service company Aiven, thinks something similar will happen with digital identities.

“Identities should be federated outside of a digital, for-profit company,” he said.
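
One established building block for identity that lives outside any single platform is a signed assertion issued by an external identity provider and verified by the relying service. The sketch below uses the PyJWT library; the issuer, audience and key are placeholders, and real federation (e.g., OpenID Connect) involves more than token verification:

```python
import jwt  # PyJWT; assumed available via `pip install PyJWT`

def verify_federated_identity(token: str, issuer_public_key: str) -> dict:
    """Verify an identity assertion signed by an external provider, so the
    platform relies on the identity without owning or issuing it."""
    return jwt.decode(
        token,
        issuer_public_key,
        algorithms=["RS256"],
        issuer="https://id.example.org",   # placeholder identity provider
        audience="metaverse-platform",     # placeholder relying party
    )
```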
