Bumble Unveils Deception Detector AI Tool to Boost User Safety

Bumble, a leading dating app, has rolled out a new artificial intelligence (AI)-based feature called Deception Detector, aimed at bolstering the safety of its user community. Designed to proactively identify and block spam, scams, and fake profiles, the feature was launched to mark Safer Internet Day on February 6. During its testing phase, the dating app reported a 45 percent drop in user reports of spam, scams, and fake accounts.

According to Bumble’s press release, a survey conducted last year with a global sample size of 28,000 individuals revealed that fake profiles and scams were the primary concerns among respondents. In India, 29 percent of participants expressed a desire to safeguard their personal data, while 28 percent cited concerns about scams when meeting someone for the first time. These findings prompted the development of Deception Detector.

Using a machine-learning model, the tool assesses the authenticity of profiles and their connections on the platform based on various signals. If a profile is flagged as spam or a scam, the tool blocks it automatically. Bumble says that 95 percent of accounts identified as fraudulent during testing were blocked by Deception Detector. The tool is also complemented by human support, to reduce the risk of malicious accounts slipping through undetected or of genuine accounts being banned in error.
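Bumble has not published the internals of Deception Detector, so the sketch below is purely illustrative: it assumes a generic binary spam/scam classifier with hypothetical signals, thresholds, and profile fields, and shows one common way such a pipeline is structured, auto-blocking high-confidence detections while routing borderline cases to human reviewers, as the article describes.

```python
# Illustrative sketch only: Bumble has not disclosed Deception Detector's design.
# All feature names, thresholds, and the triage flow here are hypothetical,
# showing a generic pattern: a binary classifier that auto-blocks
# high-confidence spam/scam profiles and escalates borderline ones to humans.

from dataclasses import dataclass
from sklearn.linear_model import LogisticRegression

@dataclass
class Profile:
    profile_id: str
    account_age_days: float      # hypothetical signal
    messages_per_hour: float     # hypothetical signal
    external_links_sent: int     # hypothetical signal

def to_features(p: Profile) -> list[float]:
    return [p.account_age_days, p.messages_per_hour, float(p.external_links_sent)]

# Toy training data standing in for historical moderation labels (1 = scam/spam).
X = [[400, 0.2, 0], [900, 0.1, 1], [2, 8.0, 12], [1, 15.0, 30]]
y = [0, 0, 1, 1]
model = LogisticRegression().fit(X, y)

AUTO_BLOCK = 0.95    # hypothetical threshold for automatic blocking
NEEDS_REVIEW = 0.50  # hypothetical threshold for escalation to human support

def triage(p: Profile) -> str:
    score = model.predict_proba([to_features(p)])[0][1]
    if score >= AUTO_BLOCK:
        return "blocked"        # automated action on high-confidence detections
    if score >= NEEDS_REVIEW:
        return "human_review"   # safety net against false positives and negatives
    return "allowed"

print(triage(Profile("p1", account_age_days=1, messages_per_hour=20.0, external_links_sent=25)))
```

The human-review tier in the sketch mirrors the role Bumble describes for its support teams: automation handles the clear-cut cases, while ambiguous ones get a second look before any ban.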

Lidiane Jones, CEO of Bumble, emphasized the company’s commitment to fostering genuine connections and empowering women to initiate interactions. She stated, “Deception Detector is our latest innovation as part of our ongoing commitment to our community to help ensure that connections made on our apps are genuine.” Jones underscored the importance of trust in the AI era and reiterated Bumble’s dedication to prioritizing women’s online experiences, emphasizing that AI remains a key focus area for the company.

This isn’t Bumble’s first foray into AI-driven safety features. In 2019, the app introduced Private Detector AI, which automatically obscures explicit photos sent between users and issues alerts accordingly. Subsequently, Bumble released an open-source version of the tool on GitHub for broader community use.
