Child Safety Standards – Droll Pics

Application name: Droll Pics – Sell Your Selfies
Developer: ShadesofRomeo Ltd
Last updated: December 17, 2025
Zero Tolerance for Child Sexual Abuse and Exploitation (CSAE)
Droll Pics – Sell Your Selfies strictly prohibits any form of Child Sexual Abuse and Exploitation (CSAE).
This includes, without limitation:
- The creation, sharing, solicitation, or distribution of Child Sexual Abuse Material (CSAM)
- Grooming, manipulation, or exploitation of minors
- Any sexualized, suggestive, or exploitative content involving minors
- Any attempt to bypass age, identity, or legal safeguards
Any CSAE-related activity results in:
- Immediate content removal
- Permanent account termination
- Reporting to relevant authorities, when required by law
Our Commitment to Child Safety
At Droll Pics, the safety and protection of children are our top priority.
We maintain zero tolerance for any form of Child Sexual Abuse and Exploitation (CSAE) and enforce strict rules, moderation systems, and legal safeguards to prevent harm to minors.
These Child Safety Standards apply to all users of Droll Pics – Sell Your Selfies and are designed to fully comply with Google Play Child Safety Standards and applicable laws.
Age Policy and Access Control
Droll Pics applies strict age-based access controls to protect minors.
- Users must meet the minimum legal age required by law in their country or region to access the app
- Users who do not meet the legal age requirements are not permitted to create an account or use the platform
- Droll Pics does not knowingly allow access that violates local child protection or online safety laws
Account Access & Identity Verification
To ensure accountability and prevent misuse:
- Access to Droll Pics requires account creation using a valid email address
- Users must agree to the Terms & Conditions before using the app
- Anonymous access is not permitted
- Identity verification (KYC) is required before:
  - Requesting payouts
  - Receiving earnings
- Age eligibility and legal capacity are confirmed during KYC review
Users who fail verification or do not meet legal requirements cannot access monetization features.
In-App Reporting and Community Safety Tools
Droll Pics provides in-app mechanisms to allow users to report safety concerns:
- Users can report individual images
- Users can report user accounts
- All reports are reviewed by administrators
- Reports related to child safety are treated with the highest priority
Automated Safeguards and Content Review
To reduce exposure to inappropriate content:
- A community flagging system is implemented
- When content receives multiple user flags, it may be:
  - Automatically blurred
  - Removed from public visibility
  - Excluded from the main feed and first-page experience
  - Placed in a restricted section pending review
- Flagged content remains restricted until reviewed by administrators
Moderation and Enforcement
Droll Pics enforces these standards through:
- Automated detection systems
- Manual moderation and regular reviews
- Immediate removal of violating content
- Account suspension or permanent banning for serious or repeated violations
Any attempt to bypass platform safeguards is treated as a serious violation.
Child Safety Contact
For child safety concerns, CSAE reports, or issues related to CSAM, please contact our Child Safety team directly.
All child safety reports are handled with urgency and confidentiality.
Legal Compliance
Droll Pics complies with all applicable child safety laws and regulations in the jurisdictions in which it operates and cooperates with law enforcement authorities when legally required.
Compliance Statement
These Child Safety Standards apply to all users of Droll Pics – Sell Your Selfies and are enforced in accordance with Google Play policies and applicable laws.