Instagram to introduce tool to protect users from getting nude photos in their DMs
New Delhi: Meta-owned Instagram is developing a feature to protect users from receiving nudity and pornography in their direct messages (DMs) from unknown people.
App developer Alessandro Paluzzi was the first to tweet a screenshot of the feature.
“Instagram is working on nudity protection for chats. Technology on your device covers photos that may contain nudity in chats. Instagram can’t access the photos,” the screenshot read.
Meta confirmed to The Verge that such a feature is being developed to protect the privacy of Instagram users.
“We’re working closely with experts to ensure these new features protect people’s privacy while giving them control over the messages they receive,” a company spokesperson said.
Meta says the technology will not allow it to view the actual messages, nor will it share them with third parties.
The move comes at a time when the UK-based nonprofit Center for Countering Digital Hate found that Instagram’s tools failed to act on 90% of image-based abusive direct messages sent to high-profile women.
Last year, in an effort to give young users a safer, more private experience on its platform, Instagram made accounts belonging to users under 16 private by default and made those accounts harder for potentially suspicious users to find.
It also limits advertisers’ options for reaching young people.
The company has also developed technology that identifies accounts exhibiting suspicious behavior and prevents them from interacting with young people’s accounts.