Rules and procedures of moderation
of Aitu social network resources
1.1. The Aitu mobile application (hereinafter referred to as the 'Application') is a mobile application available through app stores and web platforms. It is an informational application developed for mobile devices running on the Android or Apple iOS operating systems.
1.2. The Moderation Rules for the Aitu mobile application (hereinafter referred to as the 'Rules') have been developed by BTS Digital LLP (hereinafter referred to as the 'Company') with the aim of regulating the activities of moderators within the framework of the Application.
1.3. These Rules encompass a set of regulations pertaining to the posting of materials (hereinafter referred to as 'Content') in the Application, which are mandatory for all Application Users to adhere to. The Rules also extend to relationships involving third parties, whose rights and interests may be affected as a result of the activities of Application users.
2.1. Establishment of a unified set of rules that define the procedures and specific features for posting content in the Application.
2.2. Classifying the possible types of violations committed by Application users with respect to the content and quality of material posted in the Application, and providing definitions of 'undesirable' and 'prohibited' content that trigger moderator actions.
2.3. Determining the nature and scope of undesirable/prohibited content sufficient for applying corresponding administrative sanctions against Application users.
User Rights and Responsibilities
3.1. The system employs technical solutions that automatically censor textual and visual material using machine-learning (ML) algorithms, reducing the cost of manual moderation and preventing the publication of undesirable material. Control and the final decision on the placement of material (content) rest with a Moderator and follow a post-moderation approach in accordance with these Rules.
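The two-stage flow above (automatic ML pre-screening, then a Moderator's final decision) can be sketched as follows. This is a purely illustrative Python sketch: the function names, the `score` classifier, and the 0.8 threshold are assumptions for demonstration, not the Application's actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Post:
    post_id: int
    text: str
    hidden: bool = False

def screen_posts(posts: List[Post],
                 score: Callable[[str], float],
                 threshold: float = 0.8) -> List[Post]:
    """Hide posts whose ML 'undesirability' score exceeds the threshold,
    and return them as a queue for a human Moderator's final decision."""
    review_queue = []
    for post in posts:
        if score(post.text) >= threshold:
            post.hidden = True          # withheld from publication
            review_queue.append(post)   # escalated for post-moderation
    return review_queue

# Toy stand-in for a trained classifier: flags any text containing "spam".
toy_score = lambda text: 1.0 if "spam" in text.lower() else 0.0

posts = [Post(1, "Hello, world"), Post(2, "Buy now! SPAM offer")]
queue = screen_posts(posts, toy_score)
```

In this scheme the ML stage never makes the final call: flagged items are merely hidden and routed to the review queue, matching the post-moderation approach the clause describes.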
4.1. Moderation - monitoring compliance with the Application's rules: removing spam messages and filtering negative information, offensive language, and other information deemed undesirable under the Company's policy and applicable legislation. Moderation also includes verifying adherence to the terms of the Application's user agreement.
4.2. Moderator - an employee of the Company responsible for enforcing the rules in the Application.
4.3. Username - a pseudonym and/or user name in the Application.
4.4. Ban - the deprivation or restriction of certain user rights in the Application, either temporarily or permanently.
4.5. Public Channel - a public tool in which the author unilaterally shares content with their subscribers.
4.6. Public Group - a public tool for real-time messaging or content sharing.
4.7. Material (Content) - information uploaded by the User or generated by the Application during its use. Content can include text, audio and video files, graphic images, and animations – anything a User can read, see, or hear in the Application.
4.8. Authorized Body - the central executive body responsible for state regulation in the field of online platforms and online advertising.
4.9. ML - machine learning algorithms.
5.1 Prohibited content includes, but is not limited to, the following:
Incitement to, promotion of, or advocacy of any of the following: violent alteration of the constitutional system; undermining of the integrity of the Republic of Kazakhstan; endangering of state security; war; social, racial, national, religious, class, or familial superiority; the cult of cruelty and violence; suicide; pornography; narcotics, psychotropic substances, their analogs, and precursors; separatism; and fraud that contributes to the disruption of interethnic and interfaith harmony. Also prohibited are expressions that cast doubt on the statehood and territorial integrity of the Republic of Kazakhstan, information revealing state secrets or other legally protected secrets, and any other information prohibited by the laws of the Republic of Kazakhstan.
7.1. Upon registration in the Application, the User is obligated to familiarize themselves with the User Agreement, accepting the rights and responsibilities associated with the use and operation of the Application.
7.2. After registration, the User obtains the right to create, populate, and use personal, non-commercial information space within the Application, which includes the User's Personal Page and those of other Users, access to channels and public groups, and more.
7.3. Before posting information and objects (content) in the Application, the User must assess their legality and compliance with the Application's User Agreement.
7.4. The User bears full responsibility for their actions related to the creation, posting, storage, and transmission of content in the specified sections of the Application in accordance with the User Agreement, other special documents of the Application Administration, and the applicable legislation of the Republic of Kazakhstan.
Moderation Algorithm and Types of Penalty Sanctions
6.1. A Moderator takes actions to protect the rights and interests of Application Users and the requirements of the current legislation of the Republic of Kazakhstan in the event of a violation of the Moderation Rules and/or the terms of the User Agreement.
These actions may be triggered by:
● Complaints or reports from other Application Users or interested parties.
● Self-discovery of a violation.
6.2. The measure of impact on a User whose activities within the Application contradict the provisions of these Rules is at the discretion of the Moderator and may involve several scenarios:
● Removal of undesirable/prohibited content without prior or subsequent notification to the Application User.
● Removal of undesirable/prohibited content with a corresponding warning to the Application User.
● In cases of severe violations, restricting access rights and subsequent blocking of the Application User's profile.
6.3. Restriction or blocking of a User's profile may serve as a temporary, cautionary measure. The duration of restricted access or of profile blocking is determined by the Moderator in accordance with Appendix №1 to these Rules and is communicated to the User after the sanction is imposed.
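Clause 6.3 ties restriction duration to a lookup in Appendix №1. A minimal sketch of such a lookup follows; the severity labels and durations here are entirely hypothetical placeholders, not the values actually fixed in Appendix №1.

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical severity-to-duration table. The real categories and
# durations are those set out in Appendix №1, not reproduced here.
BAN_DURATIONS = {
    "minor": timedelta(days=1),
    "repeated": timedelta(days=7),
    "severe": None,  # permanent block
}

def ban_expiry(severity: str, imposed_at: datetime) -> Optional[datetime]:
    """Return when a restriction lifts, or None for a permanent block."""
    duration = BAN_DURATIONS[severity]
    return None if duration is None else imposed_at + duration
```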
6.4. Processing of complaints through channels:
● Application Users have the functionality for immediate response to content posted in channels through the 'Report' feature.
● Complaints from users are received by the Moderator.
● The Moderator reviews the channel's content and makes a decision based on its analysis. At the Moderator's discretion, the channel may be blocked. Once a channel is blocked, subscribers and other users who discover it cannot enter it; the channel owner can still enter the channel but without the ability to post.
● After a channel is blocked, the Moderator specifies the reason for the account/channel/group block. Users may appeal for the unblocking of the account/channel/group through the @moderator_group chat in Aitu or by emailing email@example.com.
6.5. Processing of complaints in Public Groups:
● Application Users have the functionality for immediate response to public groups and content posted in public groups through the “Report” feature.
● Complaints from users are received by the Moderator.
● The Moderator reviews the content in the public group and makes a decision based on its analysis. The Moderator then blocks the public group and/or the accounts of users who offend others or violate the terms or rules of using the Application.
6.6. At the discretion of the Moderator, the publication of nude images may be allowed as an exception in educational, artistic, documentary, and scientific materials, if their use is justified.
6.7. Types of penalty sanctions:
● Warning - ban/temporary ban*.
● Direct post removal or requesting removal of a message that contains prohibited content.
● Blocking users and channels.
● Unlocking users in the administrative panel after the restriction period.
6.8. The Company reserves the right to suspend the operation of accounts in the Application that post and distribute unlawful content or information deemed cyberbullying towards children based on a decree from the Authorized Body.
6.9. The Company undertakes to process all received complaints within 20 calendar days and provide an appropriate response to the User regarding the complaint.
6.10. Users receive an automatic confirmation of the receipt of their complaint.
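Clauses 6.9 and 6.10 together define a simple intake contract: an automatic confirmation on receipt plus a response deadline of 20 calendar days. A minimal sketch, assuming hypothetical names (only the 20-day window comes from the Rules):

```python
from dataclasses import dataclass
from datetime import date, timedelta

RESPONSE_WINDOW_DAYS = 20  # per clause 6.9: 20 calendar days

@dataclass
class Complaint:
    complaint_id: int
    received_on: date

def acknowledge(complaint: Complaint) -> str:
    """Automatic confirmation sent to the User on receipt (clause 6.10)."""
    return (f"Complaint #{complaint.complaint_id} received on "
            f"{complaint.received_on.isoformat()}.")

def response_deadline(complaint: Complaint) -> date:
    """Latest date by which the Company must respond (clause 6.9)."""
    return complaint.received_on + timedelta(days=RESPONSE_WINDOW_DAYS)
```

Counting in calendar days, as the clause specifies, means weekends and holidays are not excluded from the window.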
8.1. Publication of information on safety and preventive rules:
8.1.1. Regular publication of safety-rules information in publicly accessible official sources.
8.1.2. Warning users about various types of fraud and providing instructions on what to do and where to report when encountering suspicious content or activity.
8.2. Use of machine learning (ML) algorithms for the automatic detection and hiding of suspicious content:
8.2.1. Development and use of ML algorithms that automatically analyze content, identify suspicious posts or activity, and hide them from users.
8.2.2. Continuous updating and improvement of ML algorithms for the effective detection of new forms of suspicious content.
8.3.1. Maintaining a team of moderators who manually review and analyze content and information. Moderators take action to hide or remove suspicious content and information not detected by the automatic ML algorithms.
8.3.2. Regular training of moderators to ensure the consistent and proper application of moderation rules and procedures.
9.1. The User agrees to the Company's right to restrict or block access to the Application, or to take other measures against a User who violates the terms of the User Agreement, the rules of this Moderation Policy, the norms of current legislation, or the legally protected rights of third parties in the event of a substantiated complaint from such third parties. The nature of these measures, including the duration and level of access restrictions, is determined by the Company at its sole discretion and may be applied without prior or subsequent notification to the User and without explanation of the reasons.
9.2. The measure of impact on an Application User who violates the rules of this Moderation Policy is determined by the Moderator in accordance with Appendix №1 to this Policy. Upon request, the Moderator may inform the User of the nature of the violations that prompted the actions taken by the Moderator.
9.3. In the event of a violation of the terms of the User Agreement, the Company has the right to take the measures necessary to protect these terms and its interests, regardless of the statute of limitations for violations. The Moderator's inaction in a given case does not deprive the Company of the right to take the necessary actions later and does not indicate a waiver of its rights in the event of similar violations in the future.
9.4. The Moderator has the right to control content within the limits established by the Company and is not responsible for the materials stored, posted, published, or transmitted by Users using the Application's services.
9.5. The Company's right to moderate does not constitute an obligation to monitor all materials posted, published, or transmitted by Users using the Application's services. Moderation is carried out within the limits established by the Company.
Appendix №1 to the Policy on the Moderation of Aitu Social Network Resources