
Child Safety Policy

Effective Date: 11 May 2026

RUTIN is committed to protecting minors and preventing misuse of the platform in ways that could harm children. This policy explains how RUTIN safeguards younger users and prevents child exploitation risks.


1. MINIMUM AGE REQUIREMENT

Users must be at least 16 years old to create an account on RUTIN.

Accounts suspected of belonging to underage users may be reviewed or restricted.


2. ZERO TOLERANCE FOR CHILD EXPLOITATION

RUTIN strictly prohibits:

  • child sexual abuse material (CSAM)
  • sexualized content involving minors
  • grooming behavior
  • attempts to contact minors inappropriately
  • requests for private images
  • harassment of younger users

Violations lead to immediate enforcement action.


3. REPORTING CHILD SAFETY VIOLATIONS

Users should report:

  • suspicious behavior
  • inappropriate messages
  • exploitative content
  • activity targeting minors
  • identity misuse involving minors

Reports help protect the platform community.

Reports may be submitted via:

safety@rutin.co.in


4. PLATFORM RESPONSE TO CHILD SAFETY RISKS

When violations are detected, RUTIN may:

  • remove content immediately
  • restrict identities
  • suspend accounts
  • freeze communities if required
  • preserve evidence for legal review
  • report cases to authorities when required

5. COMMUNITY LEADER RESPONSIBILITIES

Community leaders must:

  • monitor unsafe behavior
  • report suspicious activity
  • avoid collecting minors' personal data
  • follow platform moderation rules

Leaders are not permitted to privately investigate users.

Serious cases must be escalated to the platform.


6. PRIVATE MESSAGE SAFETY

Users must NOT:

  • pressure minors into private conversations
  • request personal contact information
  • request photos or sensitive details
  • move conversations off-platform for unsafe reasons

Messaging misuse leads to enforcement action.


7. IDENTITY SAFETY CONTROLS

Because RUTIN supports multiple identities, the platform monitors identity misuse patterns that may indicate:

  • grooming attempts
  • impersonation targeting minors
  • trust manipulation
  • identity switching abuse

Such behavior triggers review.


8. COMMUNITY SAFETY CONTROLS

Communities must NOT:

  • promote unsafe environments for minors
  • encourage exploitation
  • allow inappropriate content involving minors

Communities violating safety rules may be:

  • restricted
  • moderated
  • frozen
  • removed

9. DATA PROTECTION FOR MINORS

RUTIN minimizes collection of personal data relating to minors.

The platform does NOT intentionally:

  • collect unnecessary sensitive data about minors
  • sell minors' data
  • expose minors' private identity information

10. LAW ENFORCEMENT COOPERATION

When required by law, RUTIN may cooperate with the following to protect minors:

  • Indian law enforcement authorities
  • cybercrime units
  • child safety protection agencies


11. SAFETY REPORT CONTACT

Child safety concerns should be reported to:

Email: safety@rutin.co.in

Platform: RUTIN

Country: India

Urgent cases receive priority handling.


12. POLICY UPDATES

This policy may be updated when:

  • laws change
  • platform features expand
  • community governance evolves
  • child safety protections improve

Users may be notified when required.