In 2024, TikTok removed more than 30.7 million videos from its platform in Pakistan, reinforcing its commitment to stronger content moderation and a safer online environment. The announcement came through TikTok’s second-quarter Community Enforcement Report.
According to the report, 99.5% of these videos were removed before users could report them, and 97% were taken down within 24 hours of being uploaded. This underscores TikTok’s reliance on automation and swift action against harmful content.
Globally, TikTok removed over 178 million videos during the second quarter of 2024, 144 million of which were deleted using automated detection technologies. The platform reports that 98.2% of harmful content is now removed before it reaches users, further demonstrating the effectiveness of its moderation systems.
These advancements align with TikTok’s strategy of utilizing AI to stay ahead of increasing content volumes and prevent the spread of harmful material. In Pakistan alone, TikTok had previously removed 20 million videos in just three months, showing consistency in its approach.
However, TikTok’s content moderation changes have not come without consequences. The platform recently laid off hundreds of human moderators globally, including around 500 in Malaysia, as AI took over a significant portion of the moderation work. The shift is part of a broader effort to optimize operations while balancing regulatory pressures, with human reviewers retained for appeals and sensitive cases.