TikTok says it removed 104M videos in H1 2020, proposes harmful content coalition with other social apps

As negotiations over the future of ByteDance's ownership of TikTok continue between investors and government officials, the video app today published its latest transparency report. In all, over 104.5 million videos were taken down; the company fielded nearly 1,800 legal requests; and it received 10,600 copyright notices in the first half of this year.

In addition, and perhaps to offset the attention that high volume of violating videos will attract, and coinciding with an appearance today before a parliamentary committee in the UK on harmful content, TikTok also announced a new initiative, potentially in partnership with other social apps, against harmful content.

The transparency report figures underline an important aspect of the popular app's impact. The US government may want to shut TikTok down over national security concerns (unless ByteDance finds a new non-Chinese controlling structure that satisfies lawmakers).

But in reality, just like other social media apps, TikTok has a different, though no less pressing, fire to fight: it is grappling with a large amount of illegal and harmful content published and shared on its platform, and as it continues to grow in popularity (it now has more than 700 million users globally), that problem will keep growing, too.

This is something TikTok appears to recognize will be an ongoing issue for the company, however its ownership outside of China is resolved. While a major question around TikTok's ownership concerns its algorithms and whether they can be part of any deal, the company has made other efforts to be more open about how it works. Earlier this year it opened a Transparency Center in the US, saying it would help experts observe and vet how it moderates content.

TikTok said that the 104,543,719 videos it removed globally for violating community guidelines or its terms of service made up less than 1% of all videos uploaded to TikTok, which gives you some idea of the sheer scale of the service.

The volume of removed videos more than doubled over the previous six months, a reflection of how much the total volume of videos on the platform has grown.

According to the previous transparency report published by the company, it took down just over 49 million videos in the second half of 2019. (It's not clear why, but that earlier report took much longer to publish, only appearing in July 2020.) The proportion of total videos taken down was roughly the same in both periods ("less than 1%").

TikTok said that 96.4% of the total were removed before being reported, and 90.3% were removed before they received any views. It doesn't specify whether these were caught by automated systems or human moderators, or a mixture of both, but it sounds as though it has shifted to algorithm-based moderation in at least some markets:

“As a result of the coronavirus pandemic, we relied more heavily on technology to detect and automatically remove violating content in markets such as India, Brazil and Pakistan,” the company noted.

The company notes that the largest category of removed videos, at 30.9%, involved adult nudity and sexual activities, with minor safety at 22.3% and illegal activities at 19.6%. Other categories included suicide and self-harm, violent content, hate speech and dangerous individuals. (Videos can be counted in more than one category, it noted.)

The biggest source market for removed videos is the one where TikTok is now banned (perhaps unsurprisingly): India accounted for the lion's share at 37,682,924 videos. The US, by comparison, accounted for 9,822,996 removed videos (9.4%), making it the second-largest market.

For now, it seems that misinformation and disinformation are not the main ways TikTok is being abused, but they still account for significant numbers: some 41,820 videos (less than 0.5% of those removed in the US) violated TikTok's misinformation and disinformation policies, the company said.

Some 321,786 videos (about 3.3% of US content removals) violated its hate speech policies.

On legal requests, the report states that TikTok received 1,768 requests for user information from 42 countries/markets in the first six months of the year, with 290 (16.4%) coming from US law enforcement agencies, including 126 subpoenas, 90 search warrants and 6 court orders. In all, it received 135 requests from government agencies in 15 countries/markets to restrict or remove content.

TikTok said the harmful content coalition is based on a proposal that Vanessa Pappas, TikTok's acting head in the US, sent to nine executives at other social media platforms. It doesn't specify which ones, nor what the response has been. We are asking and will update as we learn more.

Social media coalition proposed

Meanwhile, the letter, published in full by TikTok and reprinted below, underscores a response to how actively, and with what mixed success, social media platforms are trying to reduce some of the misuse on their services. This is not the first such effort: there have been several others in which multiple companies, otherwise competitors for consumer engagement, have presented a united front to tackle things like misinformation.

This one specifically targets non-political content, and aims at “coming up with a collaborative approach to early identification and notification among industry participants of extremely violent, graphic content, including suicide.” The MOU proposed by Pappas suggested that social media platforms keep each other informed about such content – a smart move, considering how much gets shared from one platform across others.

The harmful content coalition is one more example of how social media companies are trying to take the initiative and show they are acting responsibly, in a bid to head off government regulation. With Facebook, Twitter, YouTube and others continuing to find themselves in hot water over the content shared on their platforms – despite their attempts to curb abuse and manipulation – it's unlikely this will be the last word on the matter.

Full memo below:

Recently, social and content platforms have once again been challenged by the posting and cross-posting of explicit suicide content that has impacted all of us – as well as our teams, users, and wider communities.

Like each of you, we worked diligently to reduce its spread by removing the original content and its many variants, and curtailing it from being viewed or shared by others. However, we believe each of our individual efforts to safeguard our own users and the collective community would be boosted significantly through a formal, collaborative approach to early identification and notification among industry participants of extremely violent, graphic content, including suicide.

To this end, we would like to propose the cooperative development of a Memorandum of Understanding (MOU) that will allow us to quickly notify one another of such content.

Separately, we are conducting a thorough analysis of these events as they relate to the recent sharing of suicide content, but it is clear that early identification allows platforms to respond more rapidly to suppress highly objectionable, violent material.

We are mindful that any such negotiated arrangement must be clear about the types of content it would capture, and nimble enough to allow each of us to move quickly in notifying one another of content covered by the MOU. We also appreciate that there may be regulatory constraints across regions that warrant further engagement and consideration.

To that end, we would like to convene a meeting of our respective Trust and Safety teams to further discuss such a mechanism, which we believe will help us all improve safety for our users.

We look forward to your positive response and to working together to help protect our users and the wider community.

Sincerely,

Vanessa Pappas
Head of TikTok

More to come.
