What to do if the Supreme Court doesn’t let you Ban TikTok – Asmongold Logic

In an era where social media shapes minds, TikTok—a platform owned by China’s ByteDance—has sparked a heated debate in the U.S. over its influence on youth and national security. This article delves into why the U.S. is considering a TikTok ban, highlighting concerns about foreign media ownership and the platform’s potentially harmful content algorithms.

Why ban TikTok?

The reasons for the proposed TikTok ban include:

  • U.S. Security Concerns: TikTok’s parent company, ByteDance, is based in China, raising concerns about data security and national security.
  • Data Collection: The TikTok app collects extensive information about users and their devices, raising privacy concerns.
  • Historical Media Ownership Laws: Previously, foreign companies were limited or prevented from owning certain types of media companies in the United States.
  • Impact on Youth: The algorithm is believed to negatively influence younger Americans academically, socially, and politically.
  • Chinese Restrictions: The international version of TikTok is not available in China; the domestic version, Douyin, limits minors’ screen time and promotes STEM and educational content, while the global version focuses on entertainment, affecting societal values. American adolescents want to grow up to be social media influencers and YouTubers; Chinese adolescents want to grow up to be engineers.

What if the ‘Media’ in social media was considered for TikTok?

Any foreign investment in U.S. companies, including those in the media sector, is subject to scrutiny under the Committee on Foreign Investment in the United States (CFIUS), which reviews transactions that could result in control of a U.S. business by a foreign entity, to determine the effect on national security. Media ownership by foreign entities is regulated under the Communications Act of 1934, as amended by the Telecommunications Act of 1996. The Federal Communications Commission (FCC) restricts foreign ownership of broadcast licenses to a maximum of 25% unless a higher percentage is approved by the FCC itself after determining it would not harm the public interest.

Bad logic according to Asmongold

The philosopher and Twitch streamer Asmongold criticizes the TikTok ban for its narrow focus. He argues that focusing on one company without addressing the broader environment that allows such entities to operate is illogical. Asmongold notes that other companies, also based in China or owned by Chinese entities, have similar access to users’ devices yet are not banned.


If you are concerned with China’s influence and TikTok is banned, would it prevent another Chinese company from doing the same thing as TikTok? No.

If you are concerned with social media’s impact on society and TikTok is banned, would it prevent other apps from harming users with their suggestion algorithms? No.

Addressing the Broader Issue – regulating social media algorithms

The United States and China agree on at least one thing: social media algorithms can have negative effects.

YouTube faced legal scrutiny when its recommendation algorithm was accused of promoting ISIS recruitment videos. This lawsuit highlights the profound impact social media algorithms can have on real-world behavior and the urgent need for stricter content regulation.

Facebook’s decision to modify its algorithm to prioritize group content over posts from friends may have deepened political divides in 2020 and may have led to larger protests and civil disturbances.

Instead of focusing only on TikTok, Congress should pass laws that stop any app from operating the way TikTok does and prevent algorithms from having negative impacts on society.

How?

Changes that could fix the issue

Here is what I propose:

  1. Algorithm Regulation: No company should use a multi-variable algorithm to suggest content unless it serves an educational purpose and is licensed by the FCC.
  2. Limitations on Algorithms: Outside that educational exception, content suggestion should rely on a single variable rather than on algorithms that combine multiple signals.
  3. Content Grouping Restrictions: Content should not be grouped by algorithms, and randomized content must represent at least 5% of all platform content to avoid biased feeds.

Multi-variable Suggestion Algorithms

This would stop suggestion algorithms such as TikTok’s addictive “For You” page, which uses predictive models built on many variables to keep users on the platform. Single-variable algorithms could still be allowed, for example sorting by date posted (most recent) or by view count (most viewed).
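To make the distinction concrete, here is a minimal Python sketch. The `Post` structure and the engagement weights are invented, not taken from any real platform: a single-variable sort orders the feed by one observable value, while a multi-variable recommender blends several signals into a predicted engagement score.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    post_id: str
    posted_at: datetime
    views: int
    watch_time_ratio: float  # fraction of the clip the average viewer watched (invented signal)
    topic_affinity: float    # how closely the topic matches the viewer's history (invented signal)

def rank_single_variable(posts, variable="posted_at"):
    """Allowed under the proposal: order the feed by exactly one variable,
    e.g. most recent first or most viewed first."""
    if variable == "posted_at":
        return sorted(posts, key=lambda p: p.posted_at, reverse=True)
    if variable == "views":
        return sorted(posts, key=lambda p: p.views, reverse=True)
    raise ValueError(f"unsupported sort variable: {variable}")

def rank_multi_variable(posts):
    """The kind of ranking the proposal would restrict: several signals are
    blended into one predicted-engagement score (weights are made up)."""
    def score(p: Post) -> float:
        return 0.5 * p.watch_time_ratio + 0.3 * p.topic_affinity + 0.2 * (p.views / 1_000_000)
    return sorted(posts, key=score, reverse=True)
```

A feed sorted by one number is easy to audit and explain; the blended score is what makes a personalized feed both sticky and opaque.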

Stop Grouping Users and Content by Algorithm

Social media platforms could work around limits on suggestion algorithms by algorithmically grouping users and serving each group a narrow slice of specialized content. To prevent this, users should not be grouped by an algorithm, and randomized content must make up at least 5% of the content the platform serves.
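One way to read the 5% requirement is that at least 5% of the items in any feed are drawn uniformly at random from the whole catalog rather than from a user’s algorithmic group; it could also be read as 5% of all content hosted on the platform. The Python sketch below implements the per-feed reading, with invented function and parameter names.

```python
import math
import random

def build_feed(grouped_items, full_catalog, feed_size, random_share=0.05):
    """Assemble a feed in which at least `random_share` (5% here) of the items
    are sampled uniformly from the whole catalog instead of taken from the
    user's algorithmically grouped content."""
    n_random = max(1, math.ceil(feed_size * random_share))
    n_grouped = feed_size - n_random

    random_items = random.sample(full_catalog, k=min(n_random, len(full_catalog)))
    feed = grouped_items[:n_grouped] + random_items
    random.shuffle(feed)  # mix the randomized items in rather than appending them at the end
    return feed

# Usage with toy data: a 20-item feed gets at least one randomized item.
# feed = build_feed(grouped_items=list(range(100)), full_catalog=list(range(10_000)), feed_size=20)
```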

Educational Exception

Like China, the United States should allow the use of sophisticated predictive algorithms for educational purposes under strict regulation. Educational algorithm licenses should be issued and controlled by the FCC.
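No FCC license of this kind exists today, so the registry and license IDs in the sketch below are entirely hypothetical. It only shows the shape of the check a platform might run before enabling a multi-variable recommender.

```python
from typing import Optional

# Hypothetical registry: under the proposal the FCC would publish the license IDs
# it has issued for educational recommendation systems. No such registry exists
# today; these IDs are invented for illustration.
ISSUED_EDUCATIONAL_LICENSES = {"FCC-EDU-2025-0001", "FCC-EDU-2025-0002"}

def can_use_multivariable_algorithm(purpose: str, license_id: Optional[str]) -> bool:
    """Allow a multi-variable recommender only for educational use backed by a
    valid (hypothetical) FCC educational-algorithm license."""
    return purpose == "education" and license_id in ISSUED_EDUCATIONAL_LICENSES

# can_use_multivariable_algorithm("education", "FCC-EDU-2025-0001")  -> True
# can_use_multivariable_algorithm("entertainment", None)             -> False
```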

New Data Requirements

Companies must disclose how they use data and where it is stored. Non-compliance or violations should lead to removal from software and app marketplaces in the U.S. This would address concerns about another country controlling U.S. citizens’ data.
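One way to make that disclosure enforceable is a machine-readable manifest that an app submits with its marketplace listing. Everything in this sketch is hypothetical; the field names and the check are invented, not part of any existing app store policy.

```python
# A hypothetical machine-readable disclosure an app could be required to submit
# with its marketplace listing. Every field name is invented for illustration.
data_use_disclosure = {
    "app": "ExampleClip",
    "data_collected": ["contacts", "location", "watch_history", "device_identifiers"],
    "purposes": ["content_recommendation", "advertising"],
    "storage_locations": ["United States", "Singapore"],
    "foreign_parent_access": True,  # can a non-U.S. parent company access the data?
    "retention_days": 365,
}

REQUIRED_FIELDS = {"data_collected", "purposes", "storage_locations", "foreign_parent_access"}

def missing_disclosures(disclosure: dict) -> set:
    """Marketplace-side check: which required disclosure fields are absent?
    Under the proposal, a non-empty result would mean removal from U.S. app stores."""
    return REQUIRED_FIELDS - set(disclosure)

print(missing_disclosures(data_use_disclosure))  # set() -> the disclosure is complete
```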

In the European Union, data protection and privacy rules are already in place, backed by heavy fines.

European Union’s Data Requirements

In the European Union, data protection and privacy are governed primarily by the General Data Protection Regulation (GDPR), which came into effect on May 25, 2018. GDPR has set a high standard for data protection globally and includes several key requirements that businesses must comply with when handling personal data. Here are some of the core aspects of GDPR:

  1. Transparency and Consent: GDPR requires companies to be transparent about how they collect, use, and store personal data. It mandates that consent must be clearly and affirmatively given by data subjects—meaning that companies need explicit permission to process personal data, and the purpose of processing must be made clear to the individual.
  2. Right to Access and Rectify: Individuals have the right to access their personal data, know how it is being used, and demand corrections if the data is inaccurate.
  3. Right to Erasure: Also known as the “right to be forgotten,” this allows individuals to request the deletion of their personal data when it is no longer necessary for the purpose it was collected, among other conditions.
  4. Data Portability: This right allows individuals to obtain their data from a service provider in a structured, commonly used, and machine-readable format, and to transfer it to another provider (a minimal sketch of such an export appears after this list).
  5. Data Protection by Design and by Default: Companies are required to include data protection from the initial design stages of projects and to ensure that only the necessary data for each specific purpose is processed.
  6. Data Breach Notification: GDPR imposes a duty on all organizations to report certain types of data breaches to the relevant supervisory authority, and in some cases, to the individuals affected by the breach.
  7. Data Transfer Restrictions: GDPR restricts data transfer to non-EU countries that do not meet the EU’s adequacy standards for privacy protection.
  8. Penalties for Non-Compliance: GDPR sets out severe penalties for non-compliance, which can go up to 4% of the annual global turnover of the company or 20 million euros, whichever is higher.
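As an illustration of the data portability requirement (item 4), here is a minimal Python sketch of an export in a structured, commonly used, machine-readable format. The record layout is invented for illustration; GDPR does not prescribe a specific schema.

```python
import json
from datetime import datetime, timezone

def export_user_data(user_record: dict) -> str:
    """Produce a data-portability export: the user's personal data in a
    structured, commonly used, machine-readable format (JSON in this sketch).
    The record layout is invented; GDPR does not prescribe a specific schema."""
    export = {
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "format": "json",
        "personal_data": user_record,
    }
    return json.dumps(export, indent=2)

# Usage: hand the file to the individual, or transmit it directly to another
# provider at their request.
print(export_user_data({"email": "user@example.com", "watch_history": ["clip-1", "clip-2"]}))
```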

Addressing the issues long-term

The debate over banning TikTok underscores a broader dilemma faced by global societies today: how to manage the pervasive influence of social media while safeguarding national interests and individual well-being. While specific actions such as banning a single platform like TikTok might address immediate concerns, they do not resolve the underlying issues presented by the digital age. A more holistic approach is required—one that involves comprehensive legislation to regulate and monitor social media platforms. This approach should ensure transparency in data usage, protect individuals’ rights, and maintain a balance between innovation and privacy. As we navigate these complex issues, it is crucial to foster an environment where technological advancements do not come at the expense of ethical standards and societal welfare.