TikTok head of trust and safety Suzy Loftus made a stop in Salt Lake City on Tuesday for a private press event where she spoke about the safety features on the app. But Utah lawmakers said the features available don’t go far enough to protect kids.
Salt Lake City is one of the 17 cities Loftus and her team have visited or will visit before the year’s up. When asked how Utah got put on the list, Loftus said, “I think it’s about where we have just amazing communities of creators who have stories to tell and it’s really driven by the creators in the community.”
The stop comes as Utah has filed two lawsuits against the social media giant. The state has also passed first-of-its-kind legislation aimed at protecting children from addictive features on apps and giving parents tools to supervise their kids’ accounts. The law also makes it so children and their parents can bring a private right of action forward against social media companies under certain conditions.
In addition to speaking with Loftus and two TikTok creators, Briel Adams-Wheatley and JT Laybourne, at Tuesday’s event, the Deseret News spoke with Rep. Jordan Teuscher, Sen. Mike McKell and Margaret Busse, executive director of Utah’s Department of Commerce, about the app’s safety.
At the press event, Loftus said 1.1 million Utahns are on TikTok. The heart of TikTok’s safety policies starts with the community guidelines, she said. “We’re a community and we have rules like no bullying, harassment, dangerous behavior.”
Some technology can recognize content that goes against community guidelines before it is posted, she said. In other cases, like misinformation or bullying, Loftus said when the technology can’t tell, the content goes to a human moderator (and for misinformation, it goes to fact checkers).
For younger users, she said, TikTok has more restrictions.
“Oftentimes people will think their experience on TikTok is the same experience a 13-year-old or a 15-year-old has on TikTok,” said Loftus. “And it is a much more restrictive experience for teens.”
Loftus said that for users under 16 years old, accounts are private by default, their TikToks cannot be downloaded and there is no direct messaging. For those under 18, the daily screen time limit is set to 60 minutes, and Loftus said there are no push notifications at night.
These features are available for use by all accounts — Adams-Wheatley said it was “game-changing for me. I didn’t have to see the comments I didn’t want to see.” Laybourne said when his 13-year-old signed up for the app, the features helped make it a positive experience.
As for how the TikTok app determines how old users are, Loftus said people have to enter their birthdate when they sign up. Afterward, she said, TikTok monitors for clues about users’ ages based on what they put in their bios or in their videos.
Busse said there is a difference between age gating (asking for an age) and age verification (verifying the age of a user) — and pointed toward Utah’s law requiring accurate age verification.
“There are lots of different kinds of technologies to be able to do age verification that today these platforms could start using if they wanted, but they don’t want to,” she said.
In order for the app to be safe, Busse said it needs to have meaningful age verification, put in place certain settings to protect children’s data and “get rid of their addictive features — period, full stop. And those include push notifications, infinite scroll, autoplay, those kinds of things.”
“The very act of being on the platform, of having these addictive features, just being on the app a lot, we know can lead to mental health issues,” Busse later said.
When asked if Loftus could foresee a future where parents could potentially shut off the algorithm, she responded by referencing the default screen limit. She said it’s an ongoing discussion and the screen limit can be used by anyone. She said the public policy team works with lawmakers.
Teuscher, R-South Jordan, and McKell, R-Spanish Fork, are two of the lawmakers who have worked most on Utah’s social media legislation.
Sharing a story, Teuscher said one day students found out their teacher had cancer and wanted to show support. These students got together on social media to do so — and he said this is an example of where social media can work for good.
When looking into social media, Teuscher said the biggest issues he and others found were related to the design elements that have the intent of keeping children on the platform — which causes the harm, he said.
“I would love to see these social media companies like TikTok actually recognize that these are a problem, especially for minors, and put guardrails in place to stop those things — and that’s what we’re not seeing,” said Teuscher.
Taking away the design elements that are addictive would make it easier for teens to stay off the platform, he said, because they would be able to choose to do that. When speaking to his constituents, Teuscher said he hears from parents that the algorithms are too powerful and kids find ways around restrictions.
“They needed the government to come in and put some guardrails in place to be able to protect kids,” Teuscher said. These restrictions can, in turn, help parents be proactive with their children’s social media usage.
When passing the legislation with these restrictions, McKell said the state showed what its compelling interest is.
“The overriding concern is that kids are on these platforms way too long and because they’re on the platform way too long, that impacts performance in school, that impacts sleep at night,” he said.
Addressing the issue needs to be all hands on deck, McKell said, explaining that though it’s called social media, companies mine the data of child users. He said the government regulates harmful products like tobacco and alcohol, and social media shouldn’t be any different.
“When we look at the data, a child that’s online for longer than three hours, their mental health starts to decline,” he said.
As for whether or not these social media companies are doing enough, McKell said, “Let me just be clear on this. The social media companies are not negotiating in good faith. They haven’t from day one, so I don’t expect them to do that in the future.”
While McKell is optimistic when he sees social media companies put forward tools that make their product less addictive, he said, “At the same time, we need to hold them accountable.” Referring to TikTok specifically, McKell said unless he and others see “a really good solution, we continue moving forward.”
Utah won’t sit back and just watch what the courts do, said Teuscher. The state will work side-by-side with other states to lead out in protecting children. Both Teuscher and McKell pointed toward Utah’s website socialharms.utah.gov for more information.
The Utah Department of Commerce houses the Utah Division of Consumer Protection which sued TikTok — the Utah Attorney General’s Office filed on behalf of the division.
Busse said they are waiting on two legal decisions — in the first suit filed, the state is waiting for the judge to determine whether or not the suit will be dismissed. She said she feels confident the suit will survive.
“We actually feel very confident in part because Meta’s survived the motion to dismiss,” said Busse, adding that other states have seen their lawsuits survive. In the second lawsuit, which focuses on the TikTok Live feature, a judge is currently weighing which parts of the lawsuit, if any, to unredact. Portions of the suit were redacted so the public could not read them.