Social Media’s Algorithm Under Criticism
Violent content on social media has become one of the main concerns in our societies, shaping personalities and embedding harmful ideas, with children the most affected.
The Algorithmic Shift Toward Violence
Back in 2022, Cai, a 16-year-old, was scrolling through social media when he noticed a sudden shift in his feed: innocuous videos of cute dogs gave way to disturbing content, including violent incidents and misogynistic rants. “I found myself asking, why me?” he said, surprised by the drastic change in his recommendations.
In Dublin, Andrew Kaung was working as a user safety analyst at TikTok. During his time there, Kaung and a colleague investigated the app’s recommendations for young users in the UK. The research they shared with BBC Panorama uncovered a troubling pattern: teenage boys were frequently shown violence, pornography, and misogynistic views, while teenage girls received recommendations largely free of violent content.
Social media platforms should take the issue of violent content seriously. During his time at Meta, Kaung saw firsthand the problem of relying on user reports to flag harmful material. He noted that despite Meta’s efforts to tune its algorithms to predict and catch violence, some harmful content slipped through until users reported it.
Many employees from TikTok and Meta have echoed Kaung’s concerns, asserting that big social media companies often pay little attention to the impact of their algorithms on children. According to the UK’s media regulator, Ofcom, algorithms are increasingly recommending harmful content to children, raising serious safety concerns.
“Companies have been turning a blind eye and have been treating children as they treat adults,” says Almudena Lara, Ofcom’s online safety policy development director.
Responses and Future Regulations
TikTok claims it removes about 98% of violent content. Meta, which owns Instagram, also highlights its extensive toolkit designed to ensure age-appropriate content for teenagers and to curb violent material. However, despite these efforts, users like Cai continue to encounter unwanted violent content, as well as content that promotes misogyny.
Disturbing videos and posts keep appearing in teens’ recommendations on both TikTok and Instagram; even after adjusting the algorithm and reporting unwanted content, their feeds remain full of violent material they don’t want to see. UK lawmakers continue trying to rein in the problem: upcoming legislation will require social media companies to verify users’ ages and limit the recommendation of harmful content to minors.
Final Thoughts
Failing to limit violent content on social media will have a drastic impact on society, embedding misogynistic ideas about gender and shaping toxic attitudes among young men. Social media companies like TikTok and Meta should therefore work harder to embed stronger safety features. However, the effectiveness of these measures remains questionable among experts and users alike.