Facebook announced it is hiring 3,000 more people for the community operations team tasked with monitoring videos posted on its site. The move can be seen as a response to the heavy criticism the social media giant has faced for hosting videos often deemed inappropriate for a wider audience.
The catalyst for the change appears to be a recent live video of a father in Thailand killing his 11-year-old daughter before taking his own life. The video remained visible for an entire day before it was taken down.
The new recruits will join an existing team of 4,500, and Facebook hopes the additions will make it easier for users to report such videos. The social media giant also said it aims to react just as swiftly, whether that means taking a post down or reaching out to those it believes need help.
Facebook had earlier announced that video is a key focus area for the company, and that remains the case. However, officials said videos that do not comply with Facebook's community guidelines are liable to be removed.
Videos likely to be removed include those depicting violence such as murders and suicides, hate speech, content promoting racist ideas, terrorist propaganda, and so on.
Facebook, however, acknowledged the matter can still be tricky, as it does not wish to be seen as censoring free speech and the open exchange of ideas. Live videos of, say, police high-handedness or someone committing a crime may still need to remain visible to help establish the rule of law.
Facebook said that while it will walk a fine line in clamping down on videos that depict violence, those it believes can help make society a better place will be allowed to remain. Take, for instance, a journalist reporting live on a crime in progress, or footage of an injustice meted out to a citizen.
However, the challenge is to allow such videos in the live news feed while keeping them away from young, impressionable minds. With the minimum age to join Facebook set at just 13, there should be a way to prevent young teens from being exposed to inappropriate content.
One option might be to blur videos that the wider community flags as unfit for teens to view, requiring further authorization before they can be seen. Notably, Facebook recently introduced something similar on Instagram, blurring images that others have reported as inappropriate for wider viewing. A similar approach may be what Facebook itself needs.