Oct 24, 2018
Facebook removes 8.7m child nudity images in three months
Facebook has said that 8.7 million images of child nudity were removed by its moderators in just three months. Of those 8.7 million images, 99% were taken down before any Facebook user had reported them, the social network said. Facebook added that another program can detect possible instances of child grooming related to sexual exploitation.

Now, Facebook's global head of safety, Antigone Davis, has said that the company is considering rolling out its systems for spotting child nudity and grooming to Instagram as well.

"What Facebook hasn't told us is how many potentially inappropriate accounts it knows about, or how it's identifying which accounts could be responsible for grooming and abusing children," said Tony Stower, head of child safety online for the NSPCC. "And let's be clear, police have told us that Facebook-owned apps are being used by groomers to target children."