Orchard Handbook: Community Guide for Identifying Sexual Predators and Victims Online (Version 1)
Authored by B. M. Vos (bauhinia), with additional input from Amber. CC BY-NC-SA 4.0
Clarification: This guide is based primarily on thorough observation and investigation of predator circles on Tumblr, and partially on other online spaces. Although predators operate differently depending on the platform, its nature, and how much its staff focuses on preventing CSA, the information here can help identify universal patterns among offenders and victims, so that sexual abuse can be prevented rather than merely responded to.
Introduction
Online predators are nothing new. The greater anonymity of the internet compared to real life, along with the ability to connect with virtually anyone in the world, has allowed sexual crimes to flourish there. Many community managers respond to incidents of child sexual abuse reactively (i.e. reporting CSAM to authorities and to social media platform staff to handle), simply because most of them are not trained to detect early signs of predatory behavior online and are therefore unable to act proactively.
One major obstacle to acting proactively is that sexual predators have infiltrated many communities and fandoms, typically those with a significant proportion of minors, and without appropriate context clues it can be incredibly hard to tell a predator apart from an innocent person. This problem is especially pronounced in communities that are already vulnerable to demonization by the general public and have a history of false accusations of predatory behavior being aimed at them.
Another obstacle is that, faced with the helplessness of community managers, members often take matters into their own hands, frequently in an irresponsible and sensationalist manner, leading to the phenomenon of pedophile hunting. It is important to note that more often than not, such campaigns are motivated by having an acceptable target to abuse and by gaining fame among community members, rather than by keeping communities safe.
In all cases, the safety of victims and the community should come first and foremost. Proceed with the utmost care, and remember to keep track of context clues, because nothing is black and white. Irresponsible actions may harm innocent bystanders or dissolve entire communities. Child sexual abuse is a serious crime that should not be taken lightly.
Symbols used by predators
The following section is dedicated to symbols and terminology used by sexual predators, as well as by circles adjacent to, but not directly linked with, pedophiles.
Remember the rule of context clues: some terms are also used in other, unrelated circles and contexts; those cases are noted here where relevant. Emoji combinations in particular are frequently misunderstood.
- PEAR, and/or the 🍐 emoji: stands for Pro-Expression, Anti-Repression, used in circles that encourage expressing pedophilic attractions or desires.
- Radqueer, and/or the ππ emoji combo: stands for "radical queer"; people identifying as such often advocate for the public expression of paraphilias, especially pedophilia, zoophilia, and necrophilia, as well as the use of pro- and anti-contact stances.
Note that while people in these circles are among the least likely to be active sexual abusers, radqueer circles are notorious for infiltrating LGBTQ+ communities and are one of the catalysts of the modern narrative that queer people are sexual predators.
- MAP, and/or the 🗺️ emoji: stands for Minor-Attracted Person, in other words, a pedophile.
There is an offshoot acronym, NOMAP, standing for Non-offending Minor-attracted Person.
NOTE: The acronym MAP also stands for Multi-Animator Project, and the emoji may also be used out of a genuine interest in geography and maps.
- Cunny: a term used by sexual predators to describe an AFAB minor's genitals. Also commonly seen in lolicon circles.
- ππ’: an emoji combination commonly used in lolicon circles, though its exact meaning is unknown as of this writing.
- AoA: stands for Age of Attraction, a way for predators to describe their preferred age of a victim. Note that the concept of a preferred age is not a predatory sign in itself, as we all have our own preferences, but publicly stating an AoA, especially one below 18, is not done anywhere outside of pedophile circles.
- Nepi: a term referring specifically to CSAM concerning toddlers and newborns, derived from nepiophilia, an attraction to infants and toddlers.
- Cheese pizza, or the 🍕 emoji: derived from the acronym CP, meaning "child pornography"; a dogwhistle used to circumvent censorship mechanisms.
- Youth liberation: a stance echoed within many predatory circles, advocating for the abolition of the age of consent and for giving minors full sexual autonomy, on the premise that consent laws and any attempts to protect minors from sexual abuse are oppressive to them.
Note that this term is also used outside of predatory circles, in discussions of how to grant minors better protections without relying on parental power, for example in situations involving abusive parents.
- MAP pride flag:
Usually displayed in profile pictures, banners, or as custom inline emojis, depending on the platform. It may also be represented as a series of colored emojis making up the flag's colors.
- Numerous symbols uncovered by the FBI as having been used by pedophiles in the 1990s and 2000s remain in use in some circles today.
Symbols used by victims
This section is dedicated to symbols and terminology used by (potential) victims of online child sexual abuse.
- AAM: stands for Adult-attracted Minor, which is self-explanatory. Minors identifying this way may have already been groomed into a predatory circle to some degree.
- Age followed by a "reverse"-style arrow emoji (e.g. 41 plus the emoji): this is used to disguise the minor's actual age. The emoji hints that the digits should be read in reverse order to obtain the real age (41 becomes 14). Another method of obscuring a minor's real age, especially in introduction posts, is stating the number of "plushies (or other objects) owned" or disguising it as a favorite number.
- Offering "menus": Some minors may attempt to sell nude photos of videos of themselves, commonly calling the pricelist a "menu". It is important to remember there are also adult sellers of nude content of themselves. Confirm the target's age before proceeding.
How do they operate?
As mentioned in the Introduction, predators operate differently depending on the platform and its nature; it is still important to know at least some of these patterns, since on platforms with shared traits they tend to operate in similar ways.
Federated networks
On federated networks, such as the Fediverse or (to a certain degree) Matrix, predators tend to congregate in their own spaces rather than reside on general-use servers. The fact that instances can be self-hosted by anyone with an intermediate understanding of computers makes it easy to create semi-closed communities dedicated solely to predatory activities. While plenty of them opt for allowlist federation (that is, manually maintaining a list of domains allowed to send to and receive from the instance) to prevent unauthorized access, some keep federation wide open, potentially to expose minors to predatory content and beliefs.
Centralized platforms
On centralized platforms, such as Tumblr or Twitter, the situation looks quite different. Predators are mostly invisible to the average eye; on the surface you will see almost exclusively minors. To circumvent platform censorship filters and to target their audience, hashtags containing key phrases such as "pedo dad", "icky kiddo", or "MAP/AAM" will have some letters replaced with digits, akin to leetspeak (a detection sketch is included at the end of this subsection).
Predators on centralized platforms tend to keep a low profile, as they could otherwise easily be terminated and reported to authorities, though they may still engage in limited activity such as liking or reposting a minor's posts. More often than not, because minors expose themselves publicly, offenders will message victims directly instead of keeping interactions public.
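For community managers who maintain keyword filters, this leetspeak obfuscation can be partially undone before screening. Below is a minimal sketch in Python; the substitution table and the placeholder watchlist are illustrative assumptions rather than an authoritative mapping, and a match should only ever queue something for human review, since terms like "MAP" also have innocent meanings (as noted earlier).

```python
# Minimal sketch: normalize common leetspeak substitutions before keyword
# screening, so that obfuscated hashtags still match a moderation watchlist.
# The substitution table and example watchlist are illustrative assumptions.

LEET_MAP = str.maketrans({
    "0": "o", "1": "i", "3": "e", "4": "a",
    "5": "s", "7": "t", "8": "b", "@": "a", "$": "s",
})

def normalize(text: str) -> str:
    """Lowercase the text and undo digit-for-letter substitutions."""
    return text.lower().translate(LEET_MAP)

def flag_for_review(tag: str, watchlist: set[str]) -> bool:
    """Return True if a hashtag matches a watchlist term after normalization."""
    return normalize(tag.lstrip("#")) in watchlist

# Example usage with a placeholder watchlist; flagged tags should always go
# to a human moderator, never trigger automated action on their own.
if __name__ == "__main__":
    watchlist = {"map", "aam"}                 # hypothetical terms
    print(flag_for_review("#M4P", watchlist))  # True
```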
What next?
Regardless of which type of platform you are dealing with, there are common points in how offenders operate.
One such point is moving conversations with minors to external chat apps that are often encrypted and fully anonymous, the most notorious of which is Session.
The problems with Session
Session is an encrypted, anonymous chat service launched in 2020 that relays messages between people through an onion-routing network. Initially a fork of Signal Messenger, it has since become an independent project with its own infrastructure and encryption.
Due to its focus on anonymity, Session does not ask for an e-mail address or phone number upon registration. While users can choose their own display name, each account is also assigned a Session ID, a long string of random alphanumeric characters that users exchange to add each other as contacts and start new conversations.
Because of this, Session is notoriously associated with pedocriminality: there is no reliable way to report users for sharing illegal content, and no way to identify users unless they willingly set a display name that can be linked to their public accounts on other services; the developers of Session themselves do not seem to care that their product is used to facilitate child sexual abuse.
The service is engineered in a way that encourages these activities, and it is frankly disingenuous to assume the developers act in good faith, given that they provide no tools to combat the spread of CSAM or the solicitation of minors.
Why is that important?
As mentioned previously, Session plays a core part in online child sexual abuse because of how it is engineered. Regardless of which type of platform you are dealing with, it is common for offenders and victims alike to share their Session IDs publicly in order to move conversations away from places where they could be monitored and/or reported.
Combined with other context clues, a Session ID in someone's profile biography or in a post can safely be treated as a sign of involvement in child sexual abuse, whether on the offending or the abused side.
There are, of course, other messenger apps in use, such as TeleGuard, Telegram, and even Discord, but with these there are ways to retrieve identifying information about the participants of a chat, and they are also very commonly used for unrelated, innocent purposes that have nothing to do with child sexual abuse.
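For moderation tooling, it can help to automatically surface bios or posts that appear to contain a Session ID so that a human can weigh them against other context clues. The minimal sketch below assumes that a Session ID is a 66-character hexadecimal string typically beginning with "05"; verify that format against current Session documentation before relying on it, and never treat a match on its own as proof of wrongdoing.

```python
import re

# Minimal sketch: flag text (profile bios, posts) that looks like it contains
# a Session ID so a human moderator can review it alongside other context.
# Assumption: a Session ID is a 66-character hex string, usually starting
# with "05" -- verify this against current Session documentation.
SESSION_ID_PATTERN = re.compile(r"\b05[0-9a-fA-F]{64}\b")

def contains_possible_session_id(text: str) -> bool:
    """Return True if the text contains something shaped like a Session ID."""
    return bool(SESSION_ID_PATTERN.search(text))

# Example usage: the ID below is a made-up placeholder, not a real account.
bio = "dm me! 05" + "ab" * 32
print(contains_possible_session_id(bio))  # True
```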
Advice on dealing with cases of CSA(M)
I witness an instance dedicated to predatory actions/circles! (Federated networks)
Because instances can be self-hosted, a report to the hosting provider can be made, in addition to a CyberTip, and that may be enough. Most hosting providers have rules against illegal material in their data centers and are obligated to forward such reports to authorities, as doing otherwise would make them complicit in the crime.
Under most laws, preserving evidence is the hosting provider's job. Keeping copies yourself is not advised, as doing so may get you into legal trouble, especially if the evidence involves CSAM.
Sometimes it may turn out that the hosting provider is actually fine with such content on their servers; in that case, a report to the authorities is advised. Make sure to research thoroughly and hand over as much information as possible to make the investigation easier. A report to the domain registrar is also highly advised.
As a community manager, it is also your duty to block offending instances on your side and scrub all stored posts/media, to avoid exposing yourself to potential charges of storing CSAM. Before doing so, back up any evidence needed for a report to authorities, especially if the offender originates from your own community. Take extra caution if your community also has minors in it.
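As an illustration of the blocking step, the sketch below assumes a Mastodon-based instance and its admin API endpoint for domain blocks (POST /api/v1/admin/domain_blocks); other Fediverse software ships different tooling, so treat the endpoint, parameters, and exact behaviour as assumptions to check against your own software's documentation. A full purge of already-stored remote content may additionally require the software's admin CLI.

```python
import requests

# Minimal sketch, assuming a Mastodon-based instance and its admin API
# (POST /api/v1/admin/domain_blocks). Other Fediverse software exposes
# different tooling; verify the endpoint and parameters for your setup.
INSTANCE = "https://example.community"    # hypothetical instance URL
ADMIN_TOKEN = "REPLACE_WITH_ADMIN_TOKEN"  # token with admin scopes

def suspend_domain(domain: str, comment: str) -> None:
    """Suspend federation with an offending domain and record why."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
        data={
            "domain": domain,
            "severity": "suspend",       # sever federation entirely
            "reject_media": "true",      # do not fetch or store their media
            "private_comment": comment,  # internal note for other moderators
        },
        timeout=30,
    )
    resp.raise_for_status()

# Example (hypothetical domain), run only after evidence has been preserved:
# suspend_domain("offending.example", "predatory instance; CyberTip filed")
```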
I witness predatory behavior on a platform (Centralized platforms)
Reporting the offending profiles (meaning the profiles of both offenders and victims alike) to the platform's staff is required.
Sadly, since centralized platforms have for years taken insufficient action to prevent, let alone halt, this phenomenon, it is highly recommended to also make a CyberTip report. Keep in mind that preserving evidence is the job of the platform, not of its users; platform staff are obligated to keep records even after offending profiles are terminated or deleted at the user's request.
How do I protect my community in general?
Regardless of which scenario applies to you, the KIER (Keep, Inform, Ensure, Report) rules can help you apply proactive measures to keep your spaces safe:
- Keep an eye on the following things, as they are often early signs of predatory behaviors:
- use of dogwhistles or symbolism, as these are designed to signal affiliation and stances to others in similar circles while not attracting the attention of outsiders.
- interactions between adults and minors; while there is nothing inherently wrong with these, and they can bring plenty of positive results, it is important to recognize signs of grooming such as discussions of sex or romance, especially when initiated by the adult, and a concerning interest in a minor's sexuality. Such topics should only be discussed in controlled environments, with people trained and tasked to talk about them, such as doctors, teachers, or parents.
- minors inside adult-oriented spaces; a good number of predators operating on both federated and centralized platforms lure in minors who are present in adult-oriented communities, since alertness there tends to be lower, as everyone assumes their peers are also adults. Note that CSA is very likely to happen in all-ages communities as well.
- Inform about internet safety; while this will not make you and your (underage) peers safe 100% of the time, it is crucial to keep repeating the rules of internet safety, such as not exposing sensitive information like age, location, or education level to the public.
- Ensure you are mentally prepared for community management; it is not an easy job, and handling highly sensitive topics such as the sexual abuse or grooming of children is taxing on one's mental health. To be a responsible community leader, you need to be prepared for such things to happen. Dealing with predators while you yourself are not doing well can prevent you from acting appropriately. It often helps when community management is handled by a team rather than a single person.
- Report any suspicious signs to trusted people; more often than not, dealing with signs of child sexual abuse alone is difficult. Trusted people, such as other community managers or friends, can help you weed out false alarms or confirm your suspicions. Acting carefully is more important than firing shots at a suspect, as you may misjudge someone who is actually innocent.
I identified a potential victim - what to do?
There is no single easy guide to handling victims, as several factors can complicate the process; in some cases the victim may be in the late stages of grooming, where reaching out to them may cause them to isolate even further from the outside world.
In order to ensure the process is safe, consider the following circumstances:
- Is the victim your friend? If yes, it will already be easier to talk to them than if you were a stranger, as there is already some level of trust involved.
- If not, do you know the victim's friend(s)? Reaching out to someone the victim may trust is also a good idea, especially if you have no established relationship with the victim.
- Are you aware of their family situation? If you know that the victim's relatives or immediate environment are safe, and you are already in contact with the victim, recommend that they reach out to someone offline whom they can trust, such as a parent, teacher, or another trusted adult. Do not make such suggestions when you are unsure of the victim's relationship with their family, or when their environment may react with hostility, to avoid the risk of victim-shaming.
- Reassure the victim that they are not at fault. More often than not, after learning that they have been sexually abused, victims experience internal shame and blame themselves for what happened. It is crucial to explain to them that it is not their fault, and that the blame lies with the adult who groomed and abused them. If circumstances allow, provide the victim with emotional support or consider recommending that they see a mental health therapist specializing in sexual abuse trauma.
It is highly recommended to encourage the victim to cooperate and file a report against the offender; however, the process is known to be retraumatizing for many, so additional measures should be taken, such as ensuring the victim receives proper mental healthcare and support from their real-life environment.
Addendums
Lolicon, shotacon, and cub vs. CSAM
While loli, shota, and cub content is not the same as CSAM, as only the latter involves the direct abuse of minors, it is important to understand that LSC plays a significant part in grooming minors into predatory circles and in normalizing child sexual abuse under the guise of fictional material.
It is vital to know that LSC should not be reported to CyberTip, as cases of real CSA take priority over fictional, albeit obscene and immoral, material. However, depending on the jurisdiction you or your community operates in, possession, distribution, or creation of such material may still be illegal. Explicitly excluding and moderating such content remains a crucial part of keeping minors safe online.