Child Safety Standards
Last updated: April 2026
Genana has zero tolerance for child sexual abuse material (CSAM) and for any form of child sexual exploitation, grooming, or solicitation of minors. This page describes the standards we enforce and how users can report violations.
What's prohibited
- Any content that sexualizes minors — real, AI-generated, illustrated, cartoon, or otherwise
- Content that grooms, solicits, or endangers minors in any way
- Sharing links, contacts, or platforms that facilitate child sexual exploitation
- Attempts to circumvent our safety filters to produce prohibited content
How we enforce it
- Users can report any post or user in-app via the “⋯” menu on the post or the Report button on a user's profile. “Nudity or sexual content” and “Abuse or hate” are dedicated report reasons.
- Reported content is reviewed by our moderation team and removed if it violates this policy.
- Accounts that post CSAM or attempt to exploit minors are permanently banned.
- Confirmed CSAM is reported to the National Center for Missing & Exploited Children (NCMEC) as required by U.S. law (18 U.S.C. § 2258A).
- We also apply filters to input prompts and to generated image outputs to prevent the creation of prohibited content.
How to report
If you encounter content or behavior that may violate these standards, please use the in-app report tool or email us directly. You do not need an account to report concerns.
- In-app: Tap the “⋯” (More) menu on any post or open a user's profile, then choose Report.
- Email: [email protected]
Contact
Our designated contact for child safety concerns is [email protected]. We review every report and respond within 48 hours.
Read our Terms of Service and Privacy Policy for additional information about how Genana operates.