Social media giants and online service providers are closely monitoring online content to ensure that there is no misinformation or fake news being disseminated on their platforms as Thailand gears up for Sunday’s election.
They said they will cooperate with the authorities to remove content deemed to be in violation of the laws.
Facebook said the company was in touch with a variety of stakeholders, academics and electoral organisations to collect their feedback, viewpoints and ideas on how to protect the integrity of the election experience and to encourage civic participation in local elections.
“Our Community Standards outline what is and isn’t allowed on Facebook. We will remove content that violates these standards when we are made aware of it,” Facebook told The Nation.
“We have a clear and consistent government request process, which is no different in Thailand than in the rest of the world, and we report the number of pieces of content we restrict for contravening local law in our Transparency Report,” the company said.
The Election Commission recently set up a war room to monitor whether content on social media violated election laws or spread fake news.
Facebook said it is focused on building an informed electorate and aims to empower people in Thailand with tips to help them identify false news.
Tackling misinformation is a complex issue and is a shared effort involving academics, civil society and the government, news media and technology organisations, the social media giant said.
As part of this effort, Facebook, in partnership with the Election Commission, the Ministry of Digital Economy and Society, Chulalongkorn University's Faculty of Communication Arts, and Thai News Agency's Sure and Share Center, recently provided tips to help people spot false news online.
“This is part of our ongoing commitment to the community in Thailand and we aim to build on these efforts through local initiatives,” Facebook said.
Thailand ranks as the eighth-biggest market for Facebook, with 52 million accounts, according to an April 2018 survey by Hootsuite and We Are Social.
Meanwhile, Google closely reviews requests from courts and government agencies around the world to determine whether content should be removed from its products for violating a law or its product policies, a Google spokesperson told The Nation.
In its Transparency Report, Google disclosed that Thailand sent the company 150 content-removal requests in the first six months of 2018. Google, however, did not disclose how many items were actually removed.
According to a Google report, the content it removed from its platforms in 2017 was mostly related to national security, and almost 18,000 of the 35,793 items requested were removed.
It said governments contact Google with content removal requests for a number of reasons.
“Governments ask us to remove or review content for many reasons. Some requests allege defamation, while others claim that the content violates local laws prohibiting hate speech or adult content,” the company said.
It said government requests often target political content and criticism of the government.
Governments cite defamation, privacy and even copyright laws in their attempts to remove political speech from the company’s services.
The company said this also includes government requests to review content to determine whether it violates Google's own product community guidelines and content policies.
However, the laws surrounding these issues vary by country.
“Our teams assign each request a category, such as hate speech, obscenity and defamation,” the spokesperson said.
The teams evaluate each request and review the content in context to determine whether it should be removed for violating local law or the company's content policies, the spokesperson said.
Google noted that it did not begin recording a reason for each request until the December 2010 reporting period.
Meanwhile, Line Thailand said that all information on Line Today comes from its content publishers and is therefore credible.