New update for WhatsApp video places quality at the helm
Facebook and WhatsApp are currently testing two separate updates: the former aims to make its platform a safer space for people to network, while the latter focuses on improving its video-sharing quality.
WhatsApp Messenger, a subsidiary of Facebook since 2014, is developing a new feature that lets users control the quality of the videos they send to each other. The upcoming update was spotted by WABetaInfo, a website that tracks new features in the beta version of the instant messaging application.
A pop-up message will ask users to choose one of three quality options: Auto, Best Quality and Data Saver.
“Auto” mode will compress videos as WhatsApp currently does.
“Best Quality” mode will let users share a video at the resolution stored on the device. While quality will be much better, users will notice longer upload times compared to Auto mode, especially on devices with slow internet connections.
“Data Saver” mode will send the video in its most compressed form of the three options.
This new feature is still under development and is currently not functional.
On the Facebook front, the company is developing a new feature that aims to combat violent extremism. The update, currently in its trial phase in the U.S., will warn users about extremist content they may have been exposed to.
Matt Navarra, a social media consultant, shared screenshots on Twitter in which one notice asked, “Are you concerned that someone you know is becoming an extremist?” Another warned, “You may have been exposed to harmful extremist content recently.” Both included links to “get support.”
After users spotted the feature, Facebook spokesperson Andy Stone confirmed it, noting that “any violence or hate related search term gets redirected towards resources, education, and outreach groups that can help.”
“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,” he added.
The company is partnering with various NGOs and academic experts to help create a safer space through this update.
Facebook, the trillion-dollar company, highlighted that it is aiming to identify both users who may have been exposed to rule-breaking extremist content and users who had previously been the subject of Facebook’s enforcement.
The Big Tech company has long been under pressure from policymakers and civil rights groups to fight radicalism on its platforms, especially after the January 6 Capitol riot in the U.S.
The world’s largest social media network said the efforts were part of its commitment to the “Christchurch Call to Action,” a campaign involving major tech platforms to counter violent extremist content online, launched after a 2019 attack in New Zealand that was live-streamed on Facebook.