Sometimes technology makes mistakes, and sometimes humans do. That’s why we use both to help ensure the quality of content uploaded to Wink. Our technological and human moderation solutions include:
- Software that can detect inappropriate or dangerous images uploaded to any profile on Wink.
- Technology that ensures a user’s first photo is of their face.
- A 24/7 content moderation team that reviews profile images, bios, and user reports.
If a Wink user breaks any of our Community Guidelines, either our technology or our moderation team may take action. This could result in the removal of content or a complete ban of the user. The Wink team may also report any potentially illegal activity to law enforcement.
We are continually working to improve our technical tools and moderation practices.