Our Commitment to Wellbeing & Safety

Wisdom and the Oasis Consortium

Wisdom has joined forces with fellow global tech leaders to become a member of the Oasis Consortium – a first-of-its-kind think-tank dedicated to making the internet safer and more inclusive for all.

The Oasis Consortium is dedicated to advancing digital sustainability through ethical standards and technologies. It brings together global thought leaders and practitioners representing every facet of the internet.

The group published the Oasis Consortium’s User Safety Standards – a first-of-its-kind set of operating principles for businesses and leaders of this new digital age to follow. Wisdom has committed to following these safety standards and sits alongside other tech leaders – all dedicated to building a safer, more inclusive Metaverse and Web 3.0.

Oasis Consortium President Tiffany Xingyu Wang said of Wisdom: “Oasis User Safety Standards provide a roadmap for emerging platforms to build out trust and safety teams, programs, and tech stack. With more and more innovators like Wisdom signing up to the Standards, we can build digital sustainability into Web3.0 from the get-go.”


1. We have one North-Star metric: conversations that matter.

While so many apps emphasize images or videos, we know attractiveness is just one small part of what real people care about: a sense of humor, intelligence, honesty, kindness — all of these virtues matter even more than appearance. Every decision we make is examined through one lens: does it drive conversations that matter? 


2. Voice invites vulnerability.

Vulnerability is the key to every great conversation. No good conversation is one-way, and there can be no give-and-take without a willingness to receive … without vulnerability. Voice brings out the best in each person, and what higher purpose does an app have than to bring out the better angels of our nature, rather than the demons?


3. Voice discourages trolls.

If the medium is the message, voice is fundamentally safer and more beneficial to well-being than text and photos. Trolls can reach people far and wide with text and photos, and the more outrageous their behavior, the further their message travels. With voice and the inherent back-and-forth of the medium, trolls come across as what they are … sad, troubled people to be pitied rather than followed. Because we have active moderation and access to recordings, a troll’s message does not go far and wide.


4. Voice is the antidote to weaponized distraction: PUT YOUR PHONE DOWN!!

Voice is magical for allowing deep, substantive, long-form conversations. Long-form conversational content is the opposite of the distraction machine that most popular social apps are engineered to be. We don’t have neural networks trained to fracture the human attention span through mindless tapping and constant context switching. We train our neural networks to find one thing … a conversation that matters to you … and then we design the experience to keep you focused on it. Ideally, you won’t even look at the screen, and for god’s sake, you’ll put the phone down :)


5. Voice demands authenticity.

People want to engage in authentic interactions online. They don’t want to be swiped away. They don’t want to feel like part of someone else's numbers game. Attractiveness is just one small part of what real people care about: sense of humor, intelligence, honesty, kindness — all of these virtues matter more. Voice powerfully communicates these attributes. Combine voice with a real name, and you have a powerful platform for authentic interactions. 


6. We intentionally add friction to the signup process.

Most app makers are all about removing friction from account and content creation. We add friction. We don’t allow people to join anonymously or register an account with a simple email address. In 2021, it’s quite clear that anonymity has not promoted anything worthwhile on the Internet. That’s why we require mobile phone numbers or a connection to existing identity services like Instagram, Twitter, and LinkedIn. When we detect a bad actor, we have more tools at our disposal to keep them out.


7. We have 24x7x365 moderation.

Our servers never sleep, and neither does our moderation. We maintain 24-hour-a-day, 365-day-a-year coverage of our abuse reporting systems. What’s more, we strive to gather high-quality abuse reports so that action can actually be taken if there is a violation of our code of conduct.


8. We maintain a clear, plain-language code of conduct and we enforce it.

Our code of conduct is easy to read and sets forth what is not acceptable. It is a living document that evolves with the behaviors and needs of the community. On apps where Creators can reach people they don’t know, we require acceptance of these terms every time they go live, and we remove accounts and ban community members who violate them. We don’t particularly care about de-platforming anyone. We care, rather, that our community is kind and inviting.


9. We collaborate across the industry to promote safety and wellbeing.

We are working with specialist providers of algorithmic voice-moderation services so that, in the future, we can supplement our abuse reports with additional metadata that will help us moderate content. We leverage other companies’ technologies whenever we can to improve the safety and security of the service. We are committed not only to following best practices but also to inventing the next best practice. We know “best practice” is always evolving.


10. We know why we started this company.

We create social technology infused with humanity, not to monetize it, but to enhance it, to shape it. We do not reduce individuals to so many bits and bytes. We amplify vulnerable voices, shaping our shared story around the better angels of our nature. You will know our apps by their humanity.