
The overreaching TikTok ban could also harm US industry

For all of its interpretations over time, the First Amendment to the U.S. Constitution is pretty straightforward: “Congress shall make no law … abridging the freedom of speech or of the press.”  

This stands in stark contrast to the recently enacted Protecting Americans from Foreign Adversary Controlled Applications Act, which, in more than 1,800 words, targets TikTok and appears to do precisely what the Bill of Rights forbids Congress from doing.

The law applies to TikTok, its parent corporation ByteDance, and their subsidiaries, successors and owned or controlled entities. It also applies to other companies “controlled by a foreign adversary” that are presidentially determined to “present a significant threat to the national security of the United States” as detailed in a notice and report to Congress “describing the specific national security concern involved.”

However, despite providing a 90-day window to challenge the determination, the law neither identifies which national security concerns merit an app’s shutdown or divestiture nor provides a way of quantifying a “significant threat.” Thus, it is nearly impossible for a company to know whether it is venturing precariously close to (or has crossed) the line that could result in its shutdown or forced divestiture.

Even more problematically, companies can challenge the determination only after the “significant threat” has been outlined in the public report to Congress, which places the burden on the respondent rather than on the government. It is also easy to imagine the complexity of the ensuing litigation, which would make seeking redress difficult for all but the largest firms.

This, though, is not the law’s only problem. It also penalizes entities that “enable the distribution, maintenance, or updating of” proscribed applications via “services” or “hosting services.” This is akin to punishing a sneaker company for enabling murder because it sold shoes to the murderer: the law contains no requirement that these services be provided knowingly for the provider to be sanctioned.

Even general-purpose service providers could be entangled if they are used by anyone (not just the company) for app distribution. To comply, providers would be forced to inspect every byte of data they transmit to ensure that it doesn’t contain the app or in some way enable its “distribution, maintenance or updating.”

Also troubling is language that limits challenges to the act to 165 days after enactment. Should this provision be upheld, anyone who could potentially be ensnared in the future would have to file with the U.S. Court of Appeals for the District of Columbia Circuit within the next five and a half months to seek anticipatory relief. This is especially burdensome for those who may be inadvertently swept in and may not realize their risk, or who lack the resources to spend challenging a law that poses only a vague and unlikely threat to them.

Finally, the law demonstrably restricts speech and the press. Banning TikTok, or any entity that qualifies as a covered company because it allows users to “generate, share, and view text, images, videos, real-time communications, or similar content,” would unquestionably impair speech, as the U.S. District Court in Washington has already ruled.

This is bad law, and Congress should act immediately to correct it.

The motivations behind the law are laudable. We can and should take action to prevent societal threats from any source, including social media. To do this, we should decide which actions or activities pose these threats and ban them specifically. Companies should be able to readily determine whether they are on the right or wrong side of the law at any time. They shouldn’t have to wait for a pronouncement or report from the White House.

Americans should know why and under what circumstances social media providers will be blocked or banned. We should be able to discuss these conditions specifically as part of a broad national debate and lawmaking process. 

If a company’s app is banned, it should be the responsibility of the government to take action to shut it down. Internet and hosting providers and others should not be conscripted as law enforcers through the threat of staggering fines for services they may not even know they are providing.

While this law was designed to impact TikTok, what it damages may be far more significant.

Jeremy Straub is the director of the North Dakota State University Cybersecurity Institute, an associate professor in the NDSU Department of Computer Science and an NDSU Challey Institute senior faculty fellow. The author’s opinions are his own. 

