
Oct 14, 2024 14:00

TikTok admits in its own research:

As TikTok’s 170 million U.S. users can attest, the platform’s hyper-personalized algorithm can be so engaging it becomes difficult to close the app. TikTok determined the precise amount of viewing it takes for someone to form a habit: 260 videos. After that, according to state investigators, a user “is likely to become addicted to the platform.”

***

Another internal document found that the company was aware that its many features designed to keep young people on the app led to a constant and irresistible urge to keep opening it.

TikTok’s own research states that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety,” according to the suit.

In addition, the documents show that TikTok was aware that “compulsive usage also interferes with essential personal responsibilities like sufficient sleep, work/school responsibilities, and connecting with loved ones.”

***

The app lets parents place time limits on their kids’ usage that range from 40 minutes to two hours per day. TikTok created a tool that set the default time prompt at 60 minutes per day.

Internal documents show that TikTok measured the success of this tool by how it was “improving public trust in the TikTok platform via media coverage,” rather than how it reduced the time teens spent on the app.

After tests, TikTok found the tool had little impact: teens' average usage dropped by about 1.5 minutes, from around 108.5 minutes per day before the tool to roughly 107 minutes with it. According to the attorney general's complaint, TikTok did not revisit the issue.

One document shows one TikTok project manager saying, “Our goal is not to reduce the time spent.” In a chat message echoing that sentiment, another employee said the goal is to “contribute to DAU [daily active users] and retention” of users.

TikTok has publicized its “break” videos, which are prompts to get users to stop endlessly scrolling and take a break. Internally, however, it appears the company didn’t think the videos amounted to much. One executive said that they are “useful in a good talking point” with policymakers, but “they’re not altogether effective.”

***

The multi-state litigation against TikTok highlighted the company’s beauty filters, which users can overlay on videos to make themselves look thinner and younger or to have fuller lips and bigger eyes.

One popular feature, known as the Bold Glamour filter, uses artificial intelligence to rework people’s faces to resemble models with high cheekbones and strong jawlines.

TikTok is aware of the harm these beauty filters can cause young users, the documents show.
[...]
The documents showcase another hidden facet of TikTok’s algorithm: the app prioritizes beautiful people.

One internal report analyzing TikTok’s main video feed found that “a high volume of … not attractive subjects” was filling everyone’s app. In response, Kentucky investigators found, TikTok retooled its algorithm to amplify users the company viewed as beautiful.

“By changing the TikTok algorithm to show fewer ‘not attractive subjects’ in the For You feed, [TikTok] took active steps to promote a narrow beauty norm even though it could negatively impact their Young Users,” the Kentucky authorities wrote.

***

TikTok is well aware of “filter bubbles.” Internal documents show the company has defined them as when a user “encounters only information and opinions that conform to and reinforce their own beliefs, caused by algorithms that personalize an individual’s online experience.”

The company knows the dangers of filter bubbles. During one internal safety presentation in 2020, employees warned that the app “can serve potentially harmful content expeditiously.” TikTok conducted internal experiments with test accounts to see how quickly they descended into negative filter bubbles.

“After following several ‘painhub’ and ‘sadnotes’ accounts, it took me 20 mins to drop into ‘negative’ filter bubble,” one employee wrote. “The intensive density of negative content makes me lower down mood and increase my sadness feelings though I am in a high spirit in my recent life.”

Another employee said, “there are a lot of videos mentioning suicide,” including one asking, “If you could kill yourself without hurting anybody would you?”

In another document, TikTok’s research found that content promoting eating disorders, often called “thinspiration,” is associated with issues such as body dissatisfaction, disordered eating, low self-esteem and depression.

Despite these warnings, TikTok’s algorithm still puts users into filter bubbles. One internal document states that users are “placed into ‘filter bubbles’ after 30 minutes of use in one sitting.” The company wrote that having more human moderators to label content is possible, but “requires large human efforts.”

***

TikTok has several layers of content moderation to weed out videos that violate its Community Guidelines. Internal documents show that the first set of eyes isn’t always a person from the company’s Trust and Safety Team.

The first round typically uses artificial intelligence to flag pornographic, violent or political content. The following rounds use human moderators, but only if the video has a certain number of views, according to the documents. These additional rounds often fail to take into account certain types of content or age-specific rules.
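The tiered flow described above can be sketched in a few lines. This is a hypothetical illustration of the structure the documents describe, not TikTok's actual system: the category names, the view-count threshold, and all function names are assumptions. The point it makes is the structural gap the filing alleges: content the AI round misses is never seen by a human unless it crosses the view threshold.

```python
# Hypothetical sketch of a tiered moderation pipeline as described in the
# documents. Categories, threshold value, and names are illustrative
# assumptions, not TikTok's actual implementation.

HUMAN_REVIEW_VIEW_THRESHOLD = 1000  # assumed value; the real threshold is not public

# Categories the automated first round is said to flag
AI_FLAGGED_CATEGORIES = {"pornographic", "violent", "political"}

def ai_screen(video: dict) -> list[str]:
    """First round: automated flagging of a fixed set of categories."""
    return [label for label in video["labels"] if label in AI_FLAGGED_CATEGORIES]

def route(video: dict) -> tuple[str, list[str]]:
    """Route a video through the tiers as the documents describe them."""
    flags = ai_screen(video)
    if flags:
        return ("removed_by_ai", flags)
    # Human moderators only see videos past a view-count threshold, so
    # low-view content in categories the AI misses slips through ("leakage").
    if video["views"] >= HUMAN_REVIEW_VIEW_THRESHOLD:
        return ("human_review", [])
    return ("published_unreviewed", [])

print(route({"labels": ["violent"], "views": 50}))      # caught by the AI round
print(route({"labels": ["self_harm"], "views": 50}))    # leaks: never reviewed
print(route({"labels": ["self_harm"], "views": 5000}))  # finally reaches a human
```

The second call is the failure mode the filing alleges: a self-harm video outside the AI's flagged categories, with too few views for human review, is published with no review at all.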

According to TikTok’s own studies, the unredacted filing shows that some suicide and self-harm content escaped those first rounds of human moderation.
[...]
TikTok acknowledges internally that it has substantial “leakage” rates of violating content that’s not removed. Those leakage rates include:
35.71% of “Normalization of Pedophilia;”
33.33% of “Minor Sexual Solicitation;”
39.13% of “Minor Physical Abuse;”
30.36% of “leading minors off platform;”
50% of “Glorification of Minor Sexual Assault;”
and 100% of “Fetishizing Minors.”
