Instagram is developing a filter to block unwanted private messages, including nude photos. The app uses an artificial-intelligence system for this. Besides nude images, the filter can also automatically detect and block threats.
The new feature should “help people protect themselves from unwanted DMs,” said a spokesperson for Instagram parent company Meta. A US survey last year found that 33 percent of women under the age of 35 have been sexually harassed online.
The filter will be an opt-in setting that users can enable. Even with it turned on, users can still choose to view a blocked photo.
The feature works without Instagram scanning the photos on its servers. The app uses technology on the user’s device, a Meta spokesperson said. “We’re working closely with experts to make sure these new features protect people’s privacy.” DMs between Instagram users are encrypted and visible only to the sender and recipient.
According to Instagram, development of the filter has only recently begun, and it has not yet been tested with users. Meta says it will soon share more about the feature, which will be called Nudity Protection.
Instagram has offered a “Hidden Words” filter for years, which lets users automatically filter out DM requests containing unwanted terms.