Instagram is the primary platform paedophile networks use to promote and sell child sexual abuse material, according to a report by Stanford University researchers and the Wall Street Journal.
Researchers at the university's Cyber Policy Center found that "large networks of accounts that appear to be operated by minors are openly advertising self-generated child sexual abuse material for sale."
“At the moment, the most important platform for these networks is Instagram. Its functions, such as recommendation algorithms and direct messaging, assist in connecting buyers and sellers.”
According to the Journal, a simple search for sexually explicit keywords referring specifically to children leads to accounts that use those terms to advertise content showing the sexual abuse of minors.
The accounts frequently "claim to be driven by the children themselves and use overtly sexual pseudonyms," the article said.
While the accounts do not explicitly state that they sell these images, they do display menus of options, which in some cases include specific sex acts.
Researchers from Stanford also discovered advertisements for videos featuring bestiality and self-harm.
The story added that, "for a certain price," children are available for in-person "meetings."
Meta, Instagram's parent company, did not immediately respond to AFP's request for comment.
According to the Journal, the social media giant acknowledged problems within its enforcement operations and said it had set up a task force to address the issues raised.
In March of this year, pension and investment funds filed a complaint against Meta, accusing the company of having "turned a blind eye" to images of child sex abuse and human trafficking posted on its platforms.