Content Moderation & Reporting Policy
Effective Date: [EFFECTIVE DATE]
Last Updated: [LAST UPDATED]
Version: 2.0
This Policy describes how KidStarter moderates content, processes reports, and enforces its policies. It forms part of the Agreement (see Terms of Service) and operates in conjunction with the Community Guidelines and Safety & Child Protection Policy.
1. Moderation Model
KidStarter uses a multi-layered moderation system:
1.1 Pre-Publication Review
All Campaigns involving minors undergo human review before publication; no such Campaign goes live without a completed moderation review. This review covers Campaign text, images, supporting documentation, and Creator credentials.
1.2 Automated Checks
Automated systems scan content for:
- PII patterns (phone numbers, email addresses, addresses, Social Security/NI numbers, full names);
- Visible text, documents, badges, and other identifying information in images (via image analysis and OCR);
- Prohibited content keywords and patterns;
- Spam and duplicate content;
- Policy trigger phrases (e.g., medical claims, guaranteed outcomes).
Flagged content is held for human review before publication or, if already published, is immediately hidden pending review.
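The pattern-based PII checks described above can be sketched as follows. This is a minimal illustration only, assuming simple regex matching; the pattern names and thresholds are hypothetical, and a production system would combine such rules with validation, contextual analysis, and trained classifiers.

```python
import re

# Hypothetical, simplified PII patterns for illustration; a real
# moderation pipeline would use far more robust detection.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\b(?:\+?\d[\s\-.]?){7,14}\d\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def flag_pii(text: str) -> list[str]:
    """Return the names of PII patterns found in `text`.

    A non-empty result would cause the content to be held for
    human review rather than published automatically.
    """
    return [name for name, pat in PII_PATTERNS.items() if pat.search(text)]

print(flag_pii("Contact me at jane@example.com or 555-123-4567"))
# → ['email', 'phone']
```

A match here does not remove content by itself; consistent with Section 1.5, it only routes the item to a human moderator.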
1.3 Post-Publication Monitoring
Published content is subject to ongoing automated scanning and may be flagged for re-review based on: user reports, updated policy triggers, or periodic sample audits.
1.4 User Reporting
Any User may report content at any time. Reports are routed to the Trust & Safety team. See Section 3 below.
1.5 Human Review
All final moderation decisions are made by trained human moderators. Moderators receive training on child safety, data protection, cultural sensitivity, and KidStarter's policies.
2. Actions We May Take
Depending on the nature and severity of the issue, KidStarter may take one or more of the following actions:
| Action | Description |
|---|---|
| Edit/Redact | Edit, crop, blur, or redact content to remove PII or safety risks (where the underlying content is otherwise compliant) |
| Remove Content | Remove specific text, images, comments, or updates |
| Pause Campaign | Temporarily hide a Campaign from public view pending investigation |
| Restrict Features | Limit specific features for a User (e.g., comments, updates, new Campaign creation) |
| Suspend Account | Temporarily or permanently suspend a User's account |
| Hold Disbursement | Place a hold on pending disbursements related to a flagged Campaign |
| Notify Authorities | Report to law enforcement, NCMEC, IWF, child protection agencies, or financial regulators as required or appropriate |
| Escalate Internally | Escalate to senior leadership, legal counsel, or the DPO for complex or high-risk cases |
3. Reporting
3.1 How to Report
- In-App: Click the "Report" button on any Campaign page, comment, update, or user profile.
- Email: [REPORT LINK OR EMAIL]
- Urgent Safety Concerns: Email [REPORT LINK OR EMAIL] with "URGENT: CHILD SAFETY" in the subject line.
3.2 What to Include in a Report
- The URL or identifier of the content/Campaign;
- A description of the concern;
- Any supporting evidence (screenshots, links);
- Your contact information (optional but helpful for follow-up).
3.3 Report Handling
- All reports are logged with a timestamp and unique reference number.
- Child safety reports are prioritized and reviewed within 4 hours during business hours, or as soon as reasonably possible outside business hours.
- Other reports are reviewed within 2 business days.
- The reporter's identity is kept confidential and is not disclosed to the reported party.
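The logging and prioritization steps above might be implemented along these lines. This is an illustrative sketch only: the field names, the `RPT-` reference-number scheme, and the category label are assumptions for the example, not KidStarter's actual implementation.

```python
import uuid
from datetime import datetime, timezone

def log_report(content_url: str, concern: str, category: str) -> dict:
    """Create a report record with a timestamp and unique reference number.

    Field names and the reference format are hypothetical.
    """
    return {
        # unique reference number assigned to every report
        "reference": f"RPT-{uuid.uuid4().hex[:8].upper()}",
        # timestamp logged at receipt
        "received_at": datetime.now(timezone.utc).isoformat(),
        "content_url": content_url,
        "concern": concern,
        "category": category,
        # child-safety reports are prioritized for faster review
        "priority": "urgent" if category == "child_safety" else "standard",
    }

report = log_report("https://kidstarter.example/campaign/123",
                    "Visible school badge in photo", "child_safety")
print(report["reference"], report["priority"])
```

The `priority` field mirrors the tiered review targets above (4 hours for child safety, 2 business days otherwise); the reporter's identity is deliberately absent from the record shown to the reported party.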
3.4 EU Digital Services Act (DSA) Compliance
Where applicable, KidStarter provides a notice-and-action mechanism compliant with the EU Digital Services Act (Regulation (EU) 2022/2065), including: easy-to-use reporting mechanisms, reasoned decisions on content removal, and access to internal complaint-handling and, where applicable, out-of-court dispute settlement.
4. Appeals
4.1. If your content is removed, your Campaign is paused, or your account is restricted, you will receive a notification explaining the specific reason and the policy provision(s) violated.
4.2. You may request reconsideration by contacting [SUPPORT EMAIL] within 14 days, providing additional context, documentation, or explanation.
4.3. Appeals are reviewed by a different member of the Trust & Safety team than the original reviewer.
4.4. KidStarter will use reasonable efforts to respond to appeals within 10 business days.
4.5. For EU users, where required by the Digital Services Act, you may also pursue out-of-court dispute resolution through a certified body in your Member State.
5. Moderator Accountability
5.1. All moderation actions are logged with: the action taken, the reason, the policy provision applied, the moderator's identifier, and a timestamp.
5.2. Moderation logs are retained for audit and compliance purposes.
5.3. Moderators are subject to internal oversight, training, and periodic quality reviews.
6. Transparency
6.1. KidStarter may publish periodic transparency reports including aggregate data on: number of reports received (by category), content removed, accounts suspended, appeals received and outcomes, and requests from authorities.
6.2. Transparency reports will not identify any individual User.
7. Contact
- Report Content: [REPORT LINK OR EMAIL]
- Appeals: [SUPPORT EMAIL]
- General Support: [SUPPORT EMAIL]