Safety & Trust Policy
Last Updated: April 12, 2026
1. Our Commitment to Safety
Kalaloka ("Platform"), a product of DigiYogi, Bangalore, Karnataka, India, is committed to providing a safe, welcoming, and trustworthy environment for all users. We believe that creative expression thrives in spaces where people feel safe. This Safety & Trust Policy outlines the measures we take to protect our community.
This policy operates in conjunction with our Community Guidelines, Acceptable Use Policy, and Terms of Service, and is informed by the Information Technology Act, 2000 and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
2. Platform Safety Measures
- Account Security: OTP-based authentication, encrypted JWT tokens with version-based revocation, and secure session management.
- Encryption: All data transmitted between your device and our servers is encrypted using HTTPS/TLS. Data at rest is encrypted where applicable.
- Access Controls: Role-based access controls for administrative functions (Super Admin, Moderator, Content Manager, Support).
- Rate Limiting: Automated rate limiting on all interactive features to prevent abuse, spam, and automated attacks.
- Report System: In-app reporting on all content and user profiles, enabling community-driven safety.
3. Content Moderation
3.1 Moderation Team
Our content moderation team reviews reported content and proactively monitors the Platform for policy violations. Moderators are trained in our Community Guidelines, Indian legal requirements for intermediaries, and sensitivity around cultural and linguistic diversity (particularly Kannada literature).
3.2 AI-Powered Content Screening
- Our AI content analysis system scans published content for quality scoring, sentiment analysis, and detection of potential policy violations.
- Content flagged by AI is queued for human review before action is taken (except for clear-cut cases like CSAM, which trigger immediate removal).
- AI screening covers text content, profile information, and user-uploaded images.
- We continuously improve our AI moderation capabilities, including for Kannada and Hindi language content.
3.3 Moderation Actions
Based on the severity of the violation, we may take the following actions:
- Content warning label or age restriction.
- Content removal.
- Content visibility restriction (changing from public to private).
- Account warning.
- Feature restriction (commenting, messaging, publishing).
- Temporary account suspension (7-30 days).
- Permanent account ban.
4. User Safety Tools
4.1 Blocking
- You can block any user from your profile. Blocked users cannot view your content, send you messages, send you gifts, or interact with your content.
- Blocking takes effect immediately and applies in both directions: neither you nor the blocked user can view or interact with each other's content.
- You can manage your blocked list through your profile settings.
4.2 Muting
- You can mute users to hide their content from your feed and notifications without blocking them.
- Muted users will not be notified that they have been muted.
4.3 Comment Controls
- Authors can disable comments on individual pieces of content.
- Authors can delete inappropriate comments on their own content.
4.4 Visibility Controls
- You can set your content visibility to Public, Followers Only, or Private.
- You can control your profile visibility settings.
5. Emergency Reporting
5.1 Self-Harm and Suicide
If you encounter content that describes or promotes self-harm or suicide, please report it immediately using the in-app report feature and select "Self-harm or suicide." We treat these reports as our highest priority.
If you or someone you know is in immediate danger, please contact emergency services:
- Emergency: 112 (India unified emergency number)
- iCall: 9152987821 (mental health helpline)
- Vandrevala Foundation: 1860-2662-345 (24/7 mental health support)
- AASRA: 9820466726 (crisis intervention)
5.2 Threats and Violence
Content containing credible threats of violence, incitement to violence, or planning of violent acts will be removed immediately upon verification and reported to law enforcement.
6. Law Enforcement Cooperation
- The Platform cooperates with law enforcement agencies and the Indian judiciary in accordance with applicable laws.
- We comply with valid court orders, search warrants, and information requests from authorized agencies under the Information Technology Act, 2000 and the Code of Criminal Procedure, 1973 (now succeeded by the Bharatiya Nagarik Suraksha Sanhita, 2023).
- We may proactively report illegal activity to law enforcement when we become aware of it, particularly in cases involving imminent threats to life, CSAM, or terrorism-related content.
- We retain data necessary for investigations as required by law, including the 180-day data retention requirement under IT Rules 2021 for user registration information.
7. Child Sexual Abuse Material (CSAM)
The Platform has an absolute zero tolerance policy for Child Sexual Abuse Material (CSAM) or any content that sexually exploits or endangers children.
- Immediate Removal: Any CSAM or child exploitation content is removed immediately upon detection, without prior notice to the uploader.
- Account Termination: The uploading account is permanently banned immediately.
- Law Enforcement Reporting: All instances are reported to:
  - The National Cyber Crime Reporting Portal (cybercrime.gov.in)
  - The National Commission for Protection of Child Rights (NCPCR)
  - The Indian Computer Emergency Response Team (CERT-In)
  - The NCMEC (National Center for Missing & Exploited Children) CyberTipline, as applicable
- Evidence Preservation: Relevant data is preserved for law enforcement investigation as required under Section 67B of the IT Act, 2000 and the POCSO Act, 2012.
8. Terrorism and Extremist Content
- Content promoting terrorism, extremism, or radicalization is strictly prohibited and will be removed immediately.
- Such content is reported to relevant Indian authorities including the National Investigation Agency (NIA) and local cyber crime cells.
- The Platform complies with all takedown orders from the Ministry of Home Affairs and other authorized agencies under Section 69A of the IT Act, 2000.
9. Contact for Safety Concerns
For urgent safety concerns or to report harmful content:
- Email: support@kalaloka.buzz (use subject "URGENT: Safety Report" for priority handling)
- In-App: Use the report button on any content or user profile.
- Grievance Officer: Nagaraj (Founder)
- Platform: Kalaloka (a product of DigiYogi)
- Address: Bangalore, Karnataka, India
For immediate emergencies, please contact local law enforcement (112) before reporting to us.