
How the Latest TikTok Headlines Relate to Ongoing Tech Policy Debates

Jennifer Huddleston

October has seen a number of bad headlines about the popular social media app TikTok. First, 13 states and the District of Columbia alleged that the platform harmed kids and “addicted” them to the app. Then, a Kentucky Public Radio report published documents indicating that the company knew about the app’s potentially negative impact on its minor users and their behavior on the platform, prompting senators to call for production of the documents in question. These latest claims follow an earlier FTC-led investigation into alleged violations of the Children’s Online Privacy Protection Act (COPPA). All this arrives at a time when the company and its users are rallying against a law that would force TikTok to choose between being sold or banned in the United States.

It seems both policymakers and the public are often conflating the wide array of concerns about the popular app. The problem with this line of thinking is that the policy solutions and principles that apply in each case, as well as the underlying questions, are different. As a result, if there are legitimate policy issues to address, the responses should be narrowly targeted to those concerns, lest they cause more damage to other important values such as free speech, innovation, or privacy.

Over the last week, many of the concerns expressed by the media and policymakers have been about the impact the app may have on young people. Such a conversation exists in a broader context around youth online safety. Before these concerns were expressed about TikTok, they were leveled at other social media apps popular with young people at the time, including Instagram and even MySpace. As a result, policymakers are quick to respond to new allegations and use them as an opportunity to promote legislation that raises significant privacy and speech concerns.

The COPPA investigation concerns specific requirements around young people’s data that the law was designed to address. If TikTok violated COPPA, existing remedies can be enforced to respond to those practices and may require changes to the platform.

Many of the allegations about TikTok and young people, however, are not about the type of nuanced data collection practices covered by COPPA, or even the presence of underage users on the app. The recent discourse from the Kentucky Public Radio report and the cases brought by the states has resurfaced the refrain of “social media addiction.” The problem here is that “addiction” has been and should be understood within the realm of medical diagnoses, and currently no such diagnosis is accepted by the medical community.

Again, this debate pre-dates the recent concerns about TikTok. As my Cato colleague Dr. Jeff Singer wrote in a 2018 piece, “Stop saying social media ‘addiction’”: “Addiction has a biopsychosocial basis with a genetic predisposition and involves neurotransmitters and interactions within reward centers of the brain. The interaction of these factors has not been established with respect to social media use.” These allegations are also not unique to TikTok. Last year, Meta faced a similar batch of lawsuits from states about the impact of its products on young people, including allegations of “addiction.”

Companies continue to evolve their tools to empower parents of young users, as well as users of all ages, to make more informed choices. These include screen-time limits on certain apps, changed defaults for young users, and more general notices about screen-time usage on smartphones. This allows each family to determine what makes the most sense for them and to curate the tools that respond to their own needs.

For example, a teen on a church mission trip or traveling for a soccer tournament may need to be on her phone to communicate with family or stay entertained on a bus, even if it is “late at night.” In other cases, a young person may be using social media frequently to stay in touch with far-away friends or to get help with a medical issue. Policy cannot properly account for these nuances and instead risks eliminating beneficial uses along with the harmful ones.

Even if it turns out that TikTok violated COPPA, or that the states win their cases about its impact on young people, these issues are very different from the one underlying the calls to force TikTok to “divest or ban.” Framing such issues as a national security concern would suggest that the law is more directed at the type of speech on TikTok, strengthening the case that it raises significant concerns under the First Amendment. I have discussed the various issues related to speech, competition, and innovation raised by the “divest or ban” bill more thoroughly in other work. But these latest allegations should emphasize the need for a principled approach. A “divest or ban” requirement would impact the speech of the app’s adult users (who are bringing their own challenge to the law) and does not take into account that many of these allegations have been leveled at other platforms as well.

There are existing tools under COPPA to respond to specific concerns about the use of underage users’ data. The other allegations have been raised against a wide array of social media companies since social media came into existence. Our policymakers should be careful not to conflate the issues at hand and should carefully examine whether those issues are best addressed by policy solutions.

As with the broader tech policy debates reflected in these latest concerns about TikTok, policymakers should carefully consider whether their proposed responses even address the underlying issue, or whether it might be better addressed through education rather than regulation. They should also consider the longer-term impact on key values like speech, innovation, and privacy that could become collateral damage of policies driven by animosity toward one type of technology or one company.
